About the event

Customers using AWS container and serverless technologies have significantly decreased the time it takes to develop and deploy new applications. T-Mobile improved build and deploy times by at least 90%, General Electric increased deployment frequency from once or twice every two weeks to multiple times per day, and Vanguard increased the speed to market of microservices from 3 months to 24 hours. Equip yourself with the same knowledge by joining one of these free hands-on workshops led by AWS experts. You will learn how to build, deploy, and scale cloud native applications faster and more efficiently, and you will leave the workshop with the skills needed to build your own prototype.



Who should attend?

Anyone interested in getting hands-on experience with building cloud native applications using serverless or container technology.

  • Cloud Architects
  • Cloud Engineers
  • DevOps Leaders
  • Developers
  • Infrastructure Administrators
  • IT Professionals


Knowledge of AWS cloud and software development is helpful.

Register for this event here:

Temporibus Autem Quibusdam Aut

Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.

  • At vero eos et accusamus et iusto odio dignissimos
  • Ducimus qui blanditiis praesentium voluptatum
  • Deleniti atque corrupti quos dolores et quas Molestias excepturi sint occaecati cupiditate non
  • At vero eos et accusamus et iusto odio dignissimos
  • Ducimus qui blanditiis praesentium voluptatum
  • Deleniti atque corrupti quos dolores et quas Molestias excepturi sint occaecati cupiditate non

Event Agenda

Tracks
  • Track 1 - Get started: Moving your data to the cloud
  • Track 2 - Building apps with modern databases
  • Track 3 - Get started: Building your data lake
  • Track 4 - Analyzing and getting insights

45 mins: Keynote

40 mins
  • Track 1: Migrate your Oracle and SQL Server databases to Amazon RDS (200)
  • Track 2: Which database to choose: Pick the right purpose-built database for the job (200)
  • Track 3: How to go from zero to data lake in days (200)
  • Track 4: Amazon Redshift use cases and deployment patterns (400)

  • Track 1: Migrate your on-premises Data Warehouse to Amazon Redshift (200)
  • Track 2: Building large scale data-driven apps with AWS databases (300)
  • Track 3: Picking the right analytical engine for your needs (200)
  • Track 4: Processing Big Data with Hadoop, Spark, and other frameworks in Amazon EMR (300)

40 mins
  • Track 1: Why cloud databases like Amazon Aurora are more scalable and reliable (300)
  • Track 2: Extreme Performance at Cloud Scale: supercharge your real-time apps with Amazon ElastiCache (300)
  • Track 3: Breaking the silos: Extending your DW to your data lake (300)
  • Track 4: Scalable, secure log analytics with Amazon Elasticsearch Service (200)

  • Track 1: Deploying open source databases on AWS (200)
  • Track 2: Databases for building business-critical enterprise apps (300)
  • Track 3: Big data in the era of heavy privacy regulations (200)
  • Track 4: Scalable, secure log analytics with Amazon Elasticsearch Service (200)

50 mins
  • Customer spotlight 1: How iRobot built the Roomba to use real-time data to smartly clean your home (45 min), or
  • Customer spotlight 2: Equinox’s Data Warehouse modernization journey (45 min)
  • Conclusion (5 min)

Event Agenda

  • If your data infrastructure is built using a data lake on AWS, you are already most of the way there towards extending your data for generative AI. Think of generative AI applications as a new application type that sits on top of an existing data foundation where you can plug in existing data sources, like your data in data lakes on Amazon S3 and systems that already exist to store and analyze data for other business applications. We will talk about assessing the data you have and how to apply it to a foundation model with Amazon Bedrock.

  • Speed matters in the data path between compute and storage to maximize valuable compute cycles during model training and inference. ML training can be accelerated with specialized compute, high performance storage, and simplification of the operations around ML workflows. We’ll show you ways to optimize performance for commonly used frameworks in AI/ML workflows.
  • One of the questions we hear from customers is “Should I use an existing foundation model (FM) or train a new model?” And often, the answer will be “use an existing FM” because these models are so capable and evolving so rapidly in their generalized knowledge and capabilities. Plus, you can customize either the responses of the model or the model itself with your own enterprise data. We’ll walk through Amazon Bedrock, which offers a choice of high-performing foundation models, and show how you can customize those FMs with your own data (see the short sketch after this list).

  • Part of the data strategy for generative AI is designing so that both the data inputs used to train and customize models, and the generated outputs of applications, include only information that is accurate and intended. Your goal is to be your own toughest auditor so you are prepared for whatever comes your way for compliance in the future. We will show you how to analyze data that is feeding your generative AI workflows, and how to implement safeguards to help prevent the use or display of unintended data and responses.
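
For a sense of what the Bedrock piece of this workflow can look like in practice, here is a minimal, hypothetical sketch of prompting a foundation model through Amazon Bedrock from Python. It assumes the boto3 SDK, the Bedrock Converse API, a Claude 3 Haiku model ID that has been enabled in your account, and the us-east-1 Region; the retrieved_context string is a stand-in for content you might pull from your own data lake on Amazon S3.

    # Hypothetical sketch: ground a Bedrock foundation model with your own data.
    # Assumes boto3 is installed, AWS credentials are configured, and the chosen
    # Claude 3 Haiku model has been enabled for your account in us-east-1.
    import boto3

    bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

    # Placeholder standing in for content retrieved from your data lake on Amazon S3
    # (for example, a document snippet surfaced by a query or a vector search).
    retrieved_context = "Q3 revenue for the Anytown region grew 12% year over year."

    response = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model ID
        messages=[{
            "role": "user",
            "content": [{
                "text": (
                    f"Using only this context:\n{retrieved_context}\n\n"
                    "Summarize the regional revenue trend in one sentence."
                )
            }],
        }],
        inferenceConfig={"maxTokens": 200, "temperature": 0.2},
    )

    # The Converse API returns the assistant message as a list of content blocks.
    print(response["output"]["message"]["content"][0]["text"])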

Agenda

Led by AWS Solutions Experts


Day 1:


Day 2:

Track 3 Session 1

Nulla eleifend felis vitae velit tristique imperdiet. Etiam nec imperdiet elit. Pellentesque sem lorem, scelerisque sed facilisis sed, vestibulum sit amet eros.

Track 4 Session 1

Integer ultrices lacus sit amet lorem viverra consequat. Vivamus lacinia interdum sapien non faucibus. Maecenas bibendum, lectus at ultrices viverra, elit magna egestas magna, a adipiscing mauris justo nec eros.

Track 5 Session 1

Nulla eleifend felis vitae velit tristique imperdiet. Etiam nec imperdiet elit. Pellentesque sem lorem, scelerisque sed facilisis sed, vestibulum sit amet eros.

Track 6 Session 1

Integer ultrices lacus sit amet lorem viverra consequat. Vivamus lacinia interdum sapien non faucibus. Maecenas bibendum, lectus at ultrices viverra, elit magna egestas magna, a adipiscing mauris justo nec eros.

Track 7 Session 1

Organizations today are looking to free themselves from the constraints of on-premises databases and leverage the power of fully managed databases in the cloud. Amazon RDS is a fully managed relational database service that you can use to run your choice of database engines including open source engines, Oracle, and SQL Server in the cloud. Amazon RDS automates time-consuming database administration tasks and adds capabilities such as replication and Multi-AZ failover to make your database deployments more scalable, available, reliable, manageable, and cost-effective. This session covers why you should consider moving your on-premises Oracle & SQL Server deployments to Amazon RDS and the tools to get started.
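
To make that concrete, here is a minimal, hypothetical sketch of provisioning a managed, Multi-AZ SQL Server instance on Amazon RDS with Python and boto3. The identifier, instance class, storage size, and credentials are placeholder values rather than recommendations, and a real migration would typically also use tools such as AWS Database Migration Service and the AWS Schema Conversion Tool to move schemas and data.

    # Hypothetical sketch: create a managed, Multi-AZ SQL Server instance on Amazon RDS.
    # Assumes boto3 is installed and AWS credentials/permissions are configured;
    # every name and size below is a placeholder, not a recommendation.
    import boto3

    rds = boto3.client("rds", region_name="us-east-1")

    rds.create_db_instance(
        DBInstanceIdentifier="example-sqlserver-db",   # placeholder identifier
        Engine="sqlserver-se",                         # SQL Server Standard Edition
        LicenseModel="license-included",
        DBInstanceClass="db.m5.large",                 # placeholder instance size
        AllocatedStorage=100,                          # storage in GiB
        MasterUsername="admin",
        MasterUserPassword="REPLACE_WITH_A_SECRET",    # prefer AWS Secrets Manager in practice
        MultiAZ=True,                                  # synchronous standby for automatic failover
        BackupRetentionPeriod=7,                       # days of automated backups
    )

    # Wait for the instance to become available, then point applications at its endpoint.
    waiter = rds.get_waiter("db_instance_available")
    waiter.wait(DBInstanceIdentifier="example-sqlserver-db")
    print("RDS instance is available")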

Track 8 Session 1

Organizations today are looking to free themselves from the constraints of on-premises databases and leverage the power of fully managed databases in the cloud. Amazon RDS is a fully managed relational database service that you can use to run your choice of database engines including open source engines, Oracle, and SQL Server in the cloud. Amazon RDS automates time-consuming database administration tasks and adds capabilities such as replication and Multi-AZ failover to make your database deployments more scalable, available, reliable, manageable, and cost-effective. This session covers why you should consider moving your on-premises Oracle & SQL Server deployments to Amazon RDS and the tools to get started.

Track 9 Session 1

Organizations today are looking to free themselves from the constraints of on-premises databases and leverage the power of fully managed databases in the cloud. Amazon RDS is a fully managed relational database service that you can use to run your choice of database engines including open source engines, Oracle, and SQL Server in the cloud. Amazon RDS automates time-consuming database administration tasks and adds capabilities such as replication and Multi-AZ failover to make your database deployments more scalable, available, reliable, manageable, and cost-effective. This session covers why you should consider moving your on-premises Oracle & SQL Server deployments to Amazon RDS and the tools to get started.

Track 10 Session 1

Organizations today are looking to free themselves from the constraints of on-premises databases and leverage the power of fully managed databases in the cloud. Amazon RDS is a fully managed relational database service that you can use to run your choice of database engines including open source engines, Oracle, and SQL Server in the cloud. Amazon RDS automates time-consuming database administration tasks and adds capabilities such as replication and Multi-AZ failover to make your database deployments more scalable, available, reliable, manageable, and cost-effective. This session covers why you should consider moving your on-premises Oracle & SQL Server deployments to Amazon RDS and the tools to get started.

Multiple City Registration

  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa
  • JAN 21 2020
    Sunt in Culpa

Venue

The Metropolitan
233 South Wacker Drive, 67th Floor, Chicago, Illinois 60606

Session Proficiency Levels Explained


  • Level 100

    Introductory

    Sessions will focus on providing an overview of AWS services and features, with the assumption that attendees are new to the topic

  • Level 200

    Intermediate

    Sessions will focus on providing best practices, details of service features and demos with the assumption that attendees have introductory knowledge of the topics

  • Level 300

    Advanced

    Sessions will dive deeper into the selected topic. Presenters assume that the audience has some familiarity with the topic, but may or may not have direct experience implementing a similar solution

  • Level 400

    Expert

    Sessions are for attendees who are deeply familiar with the topic, have implemented a solution on their own already, and are comfortable with how the technology works across multiple services, architectures, and implementations

About the event

AWS Pi Day is an annual celebration of Amazon S3's birthday, making it a fitting moment to build on 18 years of innovation and dive deep into the ways new generative AI applications will be built using that data. This developer event highlights hands-on services and steps that technologists can follow to build generative AI applications, explaining the decisions practitioners make when provisioning infrastructure for generative AI, selecting a foundation model, preparing their data, and customizing models with their own business data. Join us if you want to hear how to put your data to work with generative AI, and bring questions for our live chat!

Who should attend

If you are a developer or other hands-on practitioner making design and implementation decisions about systems, data, application design, and how the applications you’re engineering will be used by your business, AWS Pi Day is built for you. Generative AI involves operational workflows that influence developer decisions while applications are being designed and built. Selecting a foundation model (FM), deciding whether to customize it with your own data, how to implement that customization, and which tools and services to use along the way are all part of the developer experience that AWS Pi Day will walk you through.

Quis Nostrud Exercitation

Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat.

Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.

Excepteur Sint

Lorem ipsum dolor sit amet, elit, sed do ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea consequat. Duis aute irure dolor in.

Featured Speakers

  • Everett Oliven

    Head of Americas and Globals for SAP, AWS

    Everett Oliven, Head of Americas and Globals for SAP on AWS, has over 30 years of experience in management consulting, SAP Go to Market (GTM), and helping enterprise customers to transform their businesses. In his current role, he has been helping SAP customers to modernize their SAP applications and innovate with SAP on AWS.

  • David Kargman

    VP, NA S/4HANA CoE, SAP

    David Kargman is VP of SAP’s S/4HANA CoE at SAP. He has over 20 years of experience in value selling, software consulting and support, strategy and operations. He and his team of Solution Advisors work closely with customers to help them better understand the “why”, “what” & “how” of SAP S/4HANA.

  • Bob Betts

    SAP Programs Director, American Air Filter

    Bob Betts is responsible for Project One Shot, American Air Filter's (AAF) greenfield implementation of the RISE with SAP solution. One Shot includes SAP S/4HANA and the SAP Integrated Business Planning for Supply Chain, SAP Analytics Cloud, SAP Sales Cloud, SAP Service Cloud, and SAP CPQ solutions.

  • AV Vivekanandha

    SAP Solutions Architect, AWS

    AV Vivekanandha is a Solutions Architect for SAP working for AWS, and is part of the World-Wide Sales Organization (WWSO) team covering the South Central Region of the United States. He has 11 years of SAP technology experience with expertise in architecting SAP solutions, Sys-Ops implementation, and modernization of SAP workloads with AWS cloud.

  • Tarun Gulati

    SAP Specialized Sales - Canada, AWS

    Tarun Gulati is the SAP on AWS sales leader for Canada. He has 15+ years of experience in business development, strategic partnerships, and the delivery of large-scale SAP and cloud-enabled transformations, driving practice growth through integrated business and technology offerings.

  • Krishnakumar Ramadoss

    Solutions Architect - SAP on AWS, AWS

    Krishnakumar Ramadoss, aka KKR, is a Senior SAP Innovation Architect at AWS, covering SAP innovation topics on AWS across AMER regions. He has 18 years of SAP technology experience with expertise in application and integration development, implementing SAP solutions for different industry verticals, and architecting SAP solutions with the AWS cloud.

  • Bidwan Baruah

    Solutions Architect - SAP on AWS, AWS

    Bidwan Baruah is a SAP Specialist Solutions Architect with AWS who helps customers develop cloud migration and adoption strategies as related to mission-critical SAP workloads. He helps them architect, migrate, and operate SAP systems on AWS.

  • Beth Sharp

    Sr. Solutions Architect Manager - SAP on AWS, AWS

    Beth Sharp is the Sr. Manager of Americas Solution Architects at Amazon Web Services. Beth has over 25 years of SAP experience covering SAP implementations, functional application, technical and basis support, architecture and analytics. In her role, Beth and her team work with customers to develop cloud strategies, enabling them to transform their business by moving SAP landscapes to AWS.

  • Raju Gulabani, VP of Databases, Analytics & AI, AWS

    Raju Gulabani is VP of Databases, Analytics & AI within AWS at Amazon.com. He is responsible for P&L, product management, engineering, and operations for database services such as Amazon Aurora and Amazon DynamoDB, analytics services such as Amazon Redshift and Amazon EMR, and AI services such as Amazon Lex, Amazon Polly, and Amazon Rekognition. Prior to joining Amazon in his current position in 2010, Raju spent four years at Google, where he built the Google Apps business (now known as G Suite). Earlier in his career, Raju founded an Intel-backed Wi-Fi voice-over-IP company and held engineering management positions at Microsoft.

  • Ryan Kelly, Data Architect, Equinox

    Ryan Kelly is a data architect at Equinox, where he helps outline and implement frameworks for data initiatives. He also leads clickstream tracking, which helps teams gain insights on their digital initiatives. Ryan loves making it easier for people to reach and ingest their data for the purposes of business intelligence, analytics, and product/service enrichment. He also loves exploring and vetting new technologies to see how they can enhance what they do at Equinox.

  • Richard Boyd, Cloud Data Engineer, iRobot

    Richard Boyd is a cloud data engineer with the iRobot Corporation’s Cloud Data Platform, where he builds tools and services to support the world’s most beloved vacuum cleaner. Before joining iRobot, Richard built discrete event simulators for Amazon’s automated fulfillment centers at Amazon Robotics. His previous roles include cyber warfare systems analyst at MIT and researcher for the Center for Army Analysis. He holds advanced degrees in Applied Mathematics & Statistics.

Customer Highlights

Epic Logo Image

Epic Games’ entire analytics platform runs on AWS. Billions of game events, like player interactions on the map, their accuracy, damage taken and dealt, and what resources they are using, are all sent to AWS.

Epic Games

Yelp Logo Image

Yelp

Airbnb Logo Image

Airbnb

Lyft Image

Lyft

FAQs

Q: Where is this event?

A: This event is an online event, hosted by AWS on the INXPO platform.

Q: Who should attend this event?

A: Developers building data-driven apps; DBAs and data engineers building analytics infrastructure and data pipelines; and analysts and data scientists deriving insights that answer complex business questions and building/training machine learning models.

Q: How much does this event cost?

A: There is no cost to attend this event.

Q: What are the prerequisites before attending the event?

A: There are no prerequisites for attending the event. We encourage attendees to browse the Database and Analytics pages on the AWS website to get a brief overview of the services available to them.