Join us for AWS Database Modernization Week!

About the Event

Organizations running legacy on-premises databases, or self-managing databases in the cloud, still have to take on management tasks such as provisioning, patching, configuration, and backups. AWS Database Modernization Week is a series of virtual sessions designed to empower technical practitioners to migrate applications and databases to AWS. You will gain a deep understanding of how to leverage purpose-built databases to build highly scalable, highly available, distributed applications.

Join us for any or all of these technical sessions to learn about AWS Database services through presentations and practical lab demonstrations. We will explore architecture, new features, and management best practices to help you remove roadblocks and accelerate your data migration and modernization projects.


Who Should Attend

This event is ideal for technical decision makers who are looking for guidance on how to use AWS Database services to get the fastest performance at the highest scale. This is primarily a 300- to 400-level (advanced) workshop, including presentations and demos designed for those experienced with the AWS console.

The intended audience for this event includes IT Leaders, DevOps Leaders, Developers, Architects, Database Administrators, and Product Solution Managers.

Event Agenda

  • Mon, Jul 26
  • Tues, Jul 27
  • Wed, Jul 28
  • Thurs, Jul 29

Mon, Jul 26

Purpose-Built Databases for Modern Applications

Monday, July 26, 2021, 9:00 AM - 1:00 PM PT | 12:00 PM - 4:00 PM ET

Best Practices for Designing Cost-Effective Solutions in Amazon DynamoDB (Level 300-400)
9:00 AM - 11:00 AM PT | 12:00 PM - 2:00 PM ET
Cost is a key factor in every good solution, and working with Amazon DynamoDB is no exception. In this session you will learn the best practices for designing cost-effective DynamoDB solutions, starting with common pitfalls around estimating cost, and how to avoid them. We will dive deep into how to choose the optimal capacity options for your use case, cost optimization tips, and estimating cost based on your data read and write patterns. At the end of this session you will be equipped with the knowledge and steps you need to estimate cost and design cost-effective solutions in DynamoDB.
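To get a feel for the capacity math this session builds on, here is a minimal sketch of how provisioned throughput maps to read/write patterns. The helper name is ours, but the unit definitions are the documented DynamoDB behavior: one WCU covers a write of up to 1 KB per second, and one RCU covers a strongly consistent read of up to 4 KB per second (eventually consistent reads consume half).

```python
import math

def provisioned_capacity(item_size_kb, writes_per_sec, reads_per_sec,
                         strongly_consistent=True):
    """Estimate the WCUs and RCUs a steady workload consumes.

    Hypothetical helper for illustration; real workloads round up
    per request, so treat this as a back-of-the-envelope estimate.
    """
    wcu = writes_per_sec * math.ceil(item_size_kb / 1)   # 1 KB per WCU
    rcu_per_read = math.ceil(item_size_kb / 4)           # 4 KB per RCU
    if not strongly_consistent:
        rcu_per_read = rcu_per_read / 2                  # half for eventual
    return wcu, math.ceil(reads_per_sec * rcu_per_read)

# A 3.5 KB item written 10x/sec and read 100x/sec (strongly consistent):
wcu, rcu = provisioned_capacity(3.5, 10, 100)
# wcu = 10 * ceil(3.5) = 40; rcu = 100 * ceil(3.5 / 4) = 100
```

Multiplying the resulting unit counts by your Region's per-unit prices gives a first-pass cost estimate for the provisioned capacity mode.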

Modernize How You Run Apache Cassandra Workloads by Using Amazon Keyspaces (Level 300-400)
11:00 AM - 12:00 PM PT | 2:00 PM - 3:00 PM ET
Learn how Amazon Keyspaces helps you run and scale your mission-critical, operational Apache Cassandra workloads more easily by using a highly available, fully managed, and serverless Cassandra-compatible database service. Join us for a deep dive into key features such as end-to-end encryption and continuous backups as well as best practices for migrating existing Cassandra workloads to Amazon Keyspaces.

Real-Time Monitoring of DevOps Workloads Using Telegraf, Amazon Timestream and Grafana (Level 300)
12:00 PM - 1:00 PM PT | 3:00 PM - 4:00 PM ET
Real-time monitoring solutions provide intelligence to detect anomalous machine and user behavior. These solutions make it possible to locate fraudsters in a financial system, troubleshoot a poor customer experience on a web application, or flag potential security breaches. In this session we will focus on DevOps workloads and demo how to build a real-time monitoring solution using Telegraf, Amazon Timestream, and Grafana to monitor a fleet of EC2 instances.
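As a taste of what the demo covers, this sketch shapes one EC2 CPU measurement into the record format Amazon Timestream's write API expects. The dimension, measure, and database/table names here are placeholders of ours; the record keys themselves are the Timestream `WriteRecords` schema.

```python
import time

def build_cpu_record(instance_id, cpu_percent):
    """Shape one Timestream record for an EC2 CPU measurement."""
    return {
        "Dimensions": [{"Name": "instance_id", "Value": instance_id}],
        "MeasureName": "cpu_utilization",
        "MeasureValue": str(cpu_percent),
        "MeasureValueType": "DOUBLE",
        "Time": str(int(time.time() * 1000)),  # epoch milliseconds
    }

# Sending the record requires AWS credentials and an existing Timestream
# database/table ("devops" / "ec2_metrics" below are placeholder names):
#
# import boto3
# client = boto3.client("timestream-write")
# client.write_records(DatabaseName="devops", TableName="ec2_metrics",
#                      Records=[build_cpu_record("i-0123456789abcdef0", 13.5)])
```

In the session's architecture, Telegraf handles this collection-and-write step for you, and Grafana queries Timestream for visualization.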

Tues, Jul 27

Nonrelational Databases

Tuesday, July 27, 2021, 9:00 AM - 1:00 PM PT | 12:00 PM - 4:00 PM ET

How to Migrate Your Workloads Into Amazon DocumentDB (Level 300 - 400)
9:00 AM - 10:30 AM PT | 12:00 PM - 1:30 PM ET
This session will start with a quick refresher on Amazon DocumentDB (with MongoDB compatibility) and discuss why customers are migrating, the tools available, the phases involved, and the various migration methods. We will then discuss best practices to follow and dive into a hands-on lab exercise, walking through a step-by-step migration together.

Let Me Graph That For You... with Amazon Neptune (Level 300 - 400)
10:30 AM - 12:00 PM PT | 1:30 PM - 3:00 PM ET
This session will dive deep into Amazon Neptune, a fully managed graph database service that makes it easy to build and run applications that work with highly connected datasets. In this session you will learn how you can build your own solution for knowledge graph, identity graph, and fraud graph use cases. We will also show how to leverage the strength, weight, and quality of the relationships within your data.

A Hands-On Introduction to Amazon Managed Blockchain (Level 300)
12:00 PM - 1:00 PM PT | 3:00 PM - 4:00 PM ET
Blockchain applications enable multiple organizations to visualize data and manage complex workflows in real-time, without relying on third parties or intermediaries. But deploying and managing a blockchain can be challenging. Amazon Managed Blockchain (AMB) takes care of much of the heavy lifting, reducing total cost of ownership and allowing customers to focus on building their applications. AMB supports two common open-source protocols for public and private blockchain networks: Hyperledger Fabric and Ethereum. In addition to providing an introduction to the service, this session will provide the learner with hands-on experience deploying blockchain applications on these two popular protocols.

Wed, Jul 28

Move to Managed Relational Databases

Wednesday, July 28, 2021, 9:00 AM - 1:00 PM PT | 12:00 PM - 4:00 PM ET

Overcoming Oracle Cost Traps With AWS - by House of Brick, Advanced AWS Partner (Level 200)
9:00 AM - 9:45 AM PT | 12:00 PM - 12:45 PM ET
Many organizations that want the benefits of AWS are hesitant to migrate Oracle database workloads due to the cost traps and risks associated with Oracle licensing. In this session, House of Brick, an Oracle licensing specialist and AWS Advanced Consulting Partner, will address common Oracle cost traps and the fear, uncertainty, and doubt that Oracle customers face in today’s market. They will also offer suggestions on how to overcome them with AWS solutions.

Dive Deep Into Amazon RDS for Oracle Advanced Features (Level 300)
9:45 AM - 10:45 AM PT | 12:45 PM - 1:45 PM ET
In this session, we will dive deep into how to handle high availability, disaster recovery, backup, and monitoring in Amazon RDS for Oracle. We will provide a technical overview of each feature accompanied by a live demo.

Amazon RDS for SQL Server Cost Optimization and Migration Best Practices (Level 300)
10:45 AM - 11:30 AM PT | 1:45 PM - 2:30 PM ET
This session will cover best practices for migrating SQL Server workloads to Amazon RDS for SQL Server, including opportunities to optimize cost. We will discuss a phased migration approach and cover tools to simplify the migration journey.

Demo: Assess and Migrate Self-Managed Database Workloads to Amazon RDS for SQL Server (Level 300)
11:30 AM - 12:15 PM PT | 2:30 PM - 3:15 PM ET
Offload your undifferentiated administration tasks by migrating self-managed SQL Server workloads to Amazon RDS for SQL Server. Learn how you can use native backup/restore to efficiently migrate to RDS SQL Server, and use AWS Database Migration Service (DMS) to minimize downtime. Also, learn how to use cross-region point-in-time restore to design multi-region Disaster Recovery strategies.

Turbocharge Amazon RDS with Amazon ElastiCache for Redis (Level 300)
12:15 PM - 1:00 PM PT | 3:15 PM - 4:00 PM ET
Learn how Amazon ElastiCache can deliver high throughput and low latency for your data-driven Amazon RDS applications, providing massive scale with sub-millisecond response times. Review a standard caching strategy, see a demonstration of caching in action, and learn how to quickly complement RDS with ElastiCache.
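The "standard caching strategy" the session reviews is commonly cache-aside (lazy loading). This minimal sketch shows the pattern's control flow; a plain dict stands in for Redis and a stub function for the RDS query, both our own placeholders, so only the pattern itself is illustrated.

```python
cache = {}          # stand-in for Redis; with real Redis: redis.Redis(...)
TTL_SECONDS = 300   # with real Redis, pass to setex() so entries expire

def query_database(user_id):
    # placeholder for an RDS query, e.g. "SELECT name FROM users WHERE id = %s"
    return f"user-{user_id}"

def get_user(user_id):
    key = f"user:{user_id}"
    if key in cache:                    # cache hit: skip the database entirely
        return cache[key]
    value = query_database(user_id)     # cache miss: fall through to RDS
    cache[key] = value                  # populate for subsequent readers
    return value
```

The first read of a key pays the database round trip; every later read within the TTL is served from memory, which is where the sub-millisecond response times come from.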

Thurs, Jul 29

Break Free from Legacy Databases

Thursday, July 29, 2021, 9:00 AM - 1:00 PM PT | 12:00 PM - 4:00 PM ET

What's New with Amazon Aurora (Level 300)
9:00 AM - 10:00 AM PT | 12:00 PM - 1:00 PM ET
Amazon Aurora is a MySQL- and PostgreSQL-compatible relational database that is fully managed and offers the speed, reliability, and availability of commercial databases at a fraction of the cost. In this session, you will learn about new features of Amazon Aurora such as enhancements to Global Database, Serverless, and Graviton2. You will also see demos to better understand how these features work.

Move to Managed: Migrating Open Source Databases From Self-Managed to Amazon RDS and Amazon Aurora (Level 300)
10:00 AM - 11:00 AM PT | 1:00 PM - 2:00 PM ET
Self-managing your open source databases may seem convenient, but is there an easier, more scalable, and more cost-effective way? Yes, there is: fully managed database services. In this session, learn how to efficiently migrate your self-managed MySQL and PostgreSQL databases from on-premises or EC2 to Amazon RDS and Amazon Aurora. You will also see a demo that provides step-by-step guidance on how to migrate successfully. AWS partner Datavail will also share a real-world migration to Amazon Aurora.

Best Practices for Migrating from Oracle to Amazon Aurora (Level 300 - 400)
11:00 AM - 1:00 PM PT | 2:00 PM - 4:00 PM ET
Reduce your dependency on commercial databases to increase agility and lower cost by migrating to Amazon Aurora. In this session, you will learn best practices and how to avoid common pitfalls when migrating from Oracle to Amazon Aurora. We’ll cover AWS Database Migration Service (DMS) and AWS Schema Conversion Tool (SCT), as well as additional resources and programs to support your Database Freedom migrations. You will also see a live demo of how to leverage SCT and DMS to complete a heterogeneous database migration in the following steps: run an SCT assessment and interpret the output, use SCT to convert the database schema from Oracle to PostgreSQL syntax, run DMS to load the target database and validate the results.
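As a preview of the DMS step in those demo stages, here is a hedged sketch of defining which source schema to replicate and creating a migration task. The "HR" schema name and all ARNs are placeholders of ours; the selection-rule JSON shape and the boto3 call are the documented DMS interfaces.

```python
import json

def table_mappings(schema_name):
    """Build the selection-rule JSON that AWS DMS expects in TableMappings."""
    return json.dumps({
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-schema",
            "object-locator": {"schema-name": schema_name, "table-name": "%"},
            "rule-action": "include",
        }]
    })

# Creating the task requires AWS credentials plus existing endpoint and
# replication-instance ARNs (source_arn, target_arn, instance_arn below
# are placeholders):
#
# import boto3
# dms = boto3.client("dms")
# dms.create_replication_task(
#     ReplicationTaskIdentifier="oracle-to-aurora",
#     SourceEndpointArn=source_arn, TargetEndpointArn=target_arn,
#     ReplicationInstanceArn=instance_arn,
#     MigrationType="full-load-and-cdc",      # full load plus ongoing changes
#     TableMappings=table_mappings("HR"),
# )
```

Schema conversion itself (Oracle DDL to PostgreSQL syntax) happens beforehand in SCT; DMS then moves the data into the converted target schema.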



Session Proficiency Levels Explained

  • Level 100

    Introductory

    Sessions will focus on providing an overview of AWS services and features, with the assumption that attendees are new to the topic

  • Level 200

    Intermediate

    Sessions will focus on providing best practices, details of service features and demos with the assumption that attendees have introductory knowledge of the topics

  • Level 300

    Advanced

    Sessions will dive deeper into the selected topic. Presenters assume that the audience has some familiarity with the topic, but may or may not have direct experience implementing a similar solution

  • Level 400

    Expert

    Sessions are for attendees who are deeply familiar with the topic, have implemented a solution on their own already, and are comfortable with how the technology works across multiple services, architectures, and implementations

Speakers

  • Raju Gulabani, VP of Databases, Analytics & AI, AWS

    Raju Gulabani is VP of Databases, Analytics & AI at AWS. He is responsible for P&L, product management, engineering, and operations for database services such as Amazon Aurora and Amazon DynamoDB, analytics services such as Amazon Redshift and Amazon EMR, and AI services such as Amazon Lex, Amazon Polly, and Amazon Rekognition. Prior to joining Amazon in his current position in 2010, Raju spent four years at Google, where he built the Google Apps business (now known as G Suite). Earlier in his career, Raju founded an Intel-backed Wi-Fi Voice over IP company and held engineering management positions at Microsoft.

  • Ryan Kelly, Data Architect, Equinox

    Ryan Kelly is a data architect at Equinox, where he helps outline and implement frameworks for data initiatives. He also leads clickstream tracking, which helps teams gain insights into their digital initiatives. Ryan loves making it easier for people to reach and ingest their data for business intelligence, analytics, and product/service enrichment. He also loves exploring and vetting new technologies to see how they can enhance what Equinox does.

  • Richard Boyd, Cloud Data Engineer, iRobot

    Richard Boyd is a cloud data engineer with the iRobot Corporation’s Cloud Data Platform where he builds tools and services to support the world’s most beloved vacuum cleaner. Before joining iRobot, Richard built discrete event simulators for Amazon’s automated fulfillment centers in Amazon Robotics. His previous roles include cyber warfare systems analyst at MIT and research for the Center for Army Analysis. He holds advanced degrees in Applied Mathematics & Statistics.


Customer Highlights

Epic Logo Image

Epic Games’ entire analytics platform runs on AWS. Billions of game events, such as player interactions on the map, their accuracy, damage taken and dealt, and what resources they are using, are all sent to AWS.

Epic Games


FAQs

  1. Where is this event?
  2. Do I have to register for the entire week?
  3. How much does this event cost?
  4. What are the prerequisites before attending the event?

Q: Where is this event?

A: This event is an online event, hosted by AWS on the GoToWebinar platform.

Q: Do I have to register for the entire week?

A: No, you can register for the day(s) that work best for you.

Q: How much does this event cost?

A: There is no cost to attend this event.

Q: What are the prerequisites before attending the event?

A: There are no prerequisites for attending the event. We encourage attendees to browse the Databases page on the AWS website to get a brief overview of the services available to them.