Tieturi is now offering the following course:

DP-200: Implementing an Azure Data Solution (3 days)

In this course, students will implement various data platform technologies into solutions that meet business and technical requirements, including on-premises, cloud, and hybrid data scenarios incorporating both relational and NoSQL data. They will also learn how to process data using a range of technologies and languages for both streaming and batch data.
The students will also explore how to implement data security including authentication, authorization, data policies and standards. They will also define and implement data solution monitoring for both the data storage and data processing activities. Finally, they will manage and troubleshoot Azure data solutions which includes the optimization and disaster recovery of big data, batch processing and streaming data solutions.


Target Group
The primary audience for this course is data professionals, data architects, and business intelligence professionals who want to learn about the data platform technologies that exist on Microsoft Azure.

The secondary audience for this course is individuals who develop applications that deliver content from the data platform technologies that exist on Microsoft Azure.

Prerequisites
In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following courses:


Contents of DP-200: Implementing an Azure Data Solution

The course begins at 9.00 and ends between 16.00 and 16.30. Breakfast is served from 8.15 onwards.

Module 1: Azure for the Data Engineer
This module explores how the world of data has evolved and how cloud data platform technologies are providing new opportunities for businesses to explore their data in different ways. The student will gain an overview of the various data platform technologies that are available, and how a Data Engineer's role and responsibilities have evolved to work in this new world to an organization's benefit.

Lessons
– Explain the evolving world of data
– Survey the services in the Azure Data Platform
– Identify the tasks that are performed by a Data Engineer
– Describe the use cases for the cloud in a Case Study

Lab: Azure for the Data Engineer
– Identify the evolving world of data
– Determine the Azure Data Platform Services
– Identify tasks to be performed by a Data Engineer
– Finalize the data engineering deliverables

After completing this module, students will be able to:
– Explain the evolving world of data
– Survey the services in the Azure Data Platform
– Identify the tasks that are performed by a Data Engineer
– Describe the use cases for the cloud in a Case Study

Module 2: Working with Data Storage
This module teaches the variety of ways to store data in Azure. The student will learn the basics of storage management in Azure, how to create a Storage Account, and how to choose the right model for the data they want to store in the cloud. They will also understand how data lake storage can be created to support a wide variety of big data analytics solutions with minimal effort.

Lessons
– Choose a data storage approach in Azure
– Create an Azure Storage Account
– Explain Azure Data Lake storage
– Upload data into Azure Data Lake

Lab: Working with Data Storage
– Choose a data storage approach in Azure
– Create a Storage Account
– Explain Data Lake Storage
– Upload data into Data Lake Store

After completing this module, students will be able to:
– Choose a data storage approach in Azure
– Create an Azure Storage Account
– Explain Azure Data Lake Storage
– Upload data into Azure Data Lake

Module 3: Enabling Team Based Data Science with Azure Databricks
This module introduces students to Azure Databricks and how a Data Engineer works with it to enable an organization to perform Team Data Science projects. They will learn the fundamentals of Azure Databricks and Apache Spark notebooks, how to provision the service and workspaces, and how to perform data preparation tasks that can contribute to the data science project.

Lessons
– Explain Azure Databricks
– Work with Azure Databricks
– Read data with Azure Databricks
– Perform transformations with Azure Databricks

Lab: Enabling Team Based Data Science with Azure Databricks
– Explain Azure Databricks
– Work with Azure Databricks
– Read data with Azure Databricks
– Perform transformations with Azure Databricks

After completing this module, students will be able to:
– Explain Azure Databricks
– Work with Azure Databricks
– Read data with Azure Databricks
– Perform transformations with Azure Databricks

Module 4: Building Globally Distributed Databases with Cosmos DB
In this module, students will learn how to work with NoSQL data using Azure Cosmos DB. They will learn how to provision the service, and how they can load and interrogate data in the service using Visual Studio Code extensions, and the Azure Cosmos DB .NET Core SDK. They will also learn how to configure the availability options so that users are able to access the data from anywhere in the world.

Lessons
– Create an Azure Cosmos DB database built to scale
– Insert and query data in your Azure Cosmos DB database
– Build a .NET Core app for Cosmos DB in Visual Studio Code
– Distribute your data globally with Azure Cosmos DB

Lab: Building Globally Distributed Databases with Cosmos DB
– Create an Azure Cosmos DB database
– Insert and query data in Azure Cosmos DB
– Build a .NET Core app for Azure Cosmos DB using Visual Studio Code
– Distribute data globally with Azure Cosmos DB

After completing this module, students will be able to:
– Create an Azure Cosmos DB database built to scale
– Insert and query data in your Azure Cosmos DB database
– Build a .NET Core app for Azure Cosmos DB in Visual Studio Code
– Distribute your data globally with Azure Cosmos DB
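
The scaling idea behind this module, Cosmos DB hash-partitions items on a partition key so related items land on the same physical partition, can be sketched in plain Python. This is illustrative only: the real service uses its own internal hash, and the `customerId` documents and four-partition layout below are invented examples, not course material.

```python
import hashlib

def assign_partition(partition_key: str, physical_partitions: int) -> int:
    # Hash the partition key and map it onto a physical partition,
    # in the spirit of Cosmos DB's hash partitioning (illustrative only;
    # the real service uses its own internal hash function).
    digest = hashlib.md5(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % physical_partitions

# Hypothetical order documents keyed on customerId.
orders = [{"id": str(i), "customerId": f"cust-{i % 5}"} for i in range(10)]

buckets = {}
for doc in orders:
    p = assign_partition(doc["customerId"], physical_partitions=4)
    buckets.setdefault(p, []).append(doc["id"])

# Every document with the same customerId lands in the same partition,
# which is what keeps single-partition queries cheap at scale.
```

Choosing a partition key with many distinct values, as the hashing above suggests, is what lets the data and throughput spread evenly across partitions.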

Module 5: Working with Relational Data Stores in the Cloud
In this module, students will explore the Azure relational data platform options, including SQL Database and SQL Data Warehouse. The student will be able to explain why they would choose one service over another, and how to provision, connect to, and manage each of the services.

Lessons
– Use Azure SQL Database
– Describe Azure SQL Data Warehouse
– Create and query an Azure SQL Data Warehouse
– Use PolyBase to Load Data into Azure SQL Data Warehouse

Lab: Working with Relational Data Stores in the Cloud
– Use Azure SQL Database
– Describe Azure SQL Data Warehouse
– Create and query an Azure SQL Data Warehouse
– Use PolyBase to Load Data into Azure SQL Data Warehouse

After completing this module, students will be able to:
– Use Azure SQL Database
– Describe Azure SQL Data Warehouse
– Create and query an Azure SQL Data Warehouse
– Use PolyBase to load data into Azure SQL Data Warehouse

Module 6: Performing Real-Time Analytics with Stream Analytics
In this module, students will learn the concepts of event processing and streaming data and how they apply to Event Hubs and Azure Stream Analytics. The students will then set up a Stream Analytics job to stream data and learn how to query the incoming data to perform analysis. Finally, they will learn how to manage and monitor running jobs.

Lessons
– Explain data streams and event processing
– Data Ingestion with Event Hubs
– Processing Data with Stream Analytics Jobs

Lab: Performing Real-Time Analytics with Stream Analytics
– Explain data streams and event processing
– Data Ingestion with Event Hubs
– Processing Data with Stream Analytics Jobs

After completing this module, students will be able to:
– Explain data streams and event processing
– Ingest data with Event Hubs
– Process data with Stream Analytics jobs
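
The core windowing idea a Stream Analytics job applies to incoming events can be sketched in plain Python: group events into fixed, non-overlapping (tumbling) windows by timestamp and aggregate per window. The timestamps and the 5-second window below are made-up example values.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    # Assign each (timestamp, payload) event to a fixed, non-overlapping
    # window and count events per window -- the same grouping a
    # TumblingWindow aggregate performs in a Stream Analytics query.
    counts = defaultdict(int)
    for ts, _payload in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)

# Four example events at seconds 0, 3, 5, and 11.
events = [(0, "a"), (3, "b"), (5, "c"), (11, "d")]
print(tumbling_window_counts(events, 5))
# → {0: 2, 5: 1, 10: 1}
```

In the real service the equivalent is written declaratively in the Stream Analytics query language rather than imperative code; the sketch only shows the grouping semantics.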

Module 7: Orchestrating Data Movement with Azure Data Factory
In this module, students will learn how Azure Data Factory can be used to orchestrate data movement and transformation across a wide range of data platform technologies. They will be able to explain the capabilities of the technology and set up an end-to-end data pipeline that ingests and transforms data.

Lessons
– Explain how Azure Data Factory works
– Azure Data Factory Components
– Azure Data Factory and Databricks

Lab: Orchestrating Data Movement with Azure Data Factory
– Explain how Azure Data Factory works
– Azure Data Factory Components
– Azure Data Factory and Databricks

After completing this module, students will be able to:
– Explain how Azure Data Factory works
– Describe Azure Data Factory components
– Use Azure Data Factory with Databricks
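
The orchestration idea the module builds on, a pipeline of activities where each activity runs only after its dependencies complete, can be sketched in plain Python. The activity names (copy, Databricks transform, warehouse load) are invented for illustration; a real pipeline is defined declaratively in Data Factory, not in code like this.

```python
def run_pipeline(activities, dependencies):
    # Execute each activity once all of its dependencies have completed,
    # the way Data Factory chains activities inside a pipeline.
    # activities: name -> callable; dependencies: name -> list of names.
    done, order = set(), []
    while len(done) < len(activities):
        for name, action in activities.items():
            if name not in done and all(d in done for d in dependencies.get(name, [])):
                action()
                done.add(name)
                order.append(name)
    return order

log = []
activities = {
    "copy_raw_data": lambda: log.append("copied"),
    "transform_in_databricks": lambda: log.append("transformed"),
    "load_to_warehouse": lambda: log.append("loaded"),
}
dependencies = {
    "transform_in_databricks": ["copy_raw_data"],
    "load_to_warehouse": ["transform_in_databricks"],
}
order = run_pipeline(activities, dependencies)
print(order)
# → ['copy_raw_data', 'transform_in_databricks', 'load_to_warehouse']
```

The dependency chain is what lets a single pipeline ingest data, hand it to Databricks for transformation, and then load the result, which is exactly the end-to-end flow the module describes.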

Module 8: Securing Azure Data Platforms
In this module, students will learn how Azure provides a multi-layered security model to protect data. The students will explore how security measures can range from setting up secure networks and access keys to defining permissions, through to monitoring across a range of data stores.

Lessons
– An introduction to security
– Key security components
– Securing Storage Accounts and Data Lake Storage
– Securing Data Stores
– Securing Streaming Data

Lab: Securing Azure Data Platforms
– An introduction to security
– Key security components
– Securing Storage Accounts and Data Lake Storage
– Securing Data Stores
– Securing Streaming Data

After completing this module, students will be able to:
– Describe Azure's approach to security
– Identify key security components
– Secure Storage Accounts and Data Lake Storage
– Secure data stores
– Secure streaming data
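
One of the access-key mechanisms covered here boils down to HMAC signing: the client signs a canonical string with the account key and the service recomputes the signature to verify it. Below is a simplified sketch of that scheme in plain Python; the key and the query-string-style string-to-sign are invented examples, not the exact canonical format Azure Storage uses.

```python
import base64
import hashlib
import hmac

def sign(string_to_sign: str, account_key_b64: str) -> str:
    # HMAC-SHA256 over a canonical string using the base64-decoded account
    # key, then base64-encode the digest -- a simplified version of how
    # Azure Storage signs SAS tokens and Shared Key authorization headers.
    key = base64.b64decode(account_key_b64)
    mac = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256)
    return base64.b64encode(mac.digest()).decode()

# Invented example key and SAS-style permissions string.
key = base64.b64encode(b"not-a-real-account-key").decode()
token_sig = sign("sv=2019-02-02&sp=r&se=2020-03-20", key)

# The service recomputes the signature on its side; any tampering with the
# permissions (e.g. read -> read/write) produces a different signature.
assert token_sig == sign("sv=2019-02-02&sp=r&se=2020-03-20", key)
assert token_sig != sign("sv=2019-02-02&sp=rw&se=2020-03-20", key)
```

This is why leaking an account key is so serious, and why the module pairs key management with network rules and permissions as separate security layers.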

Module 9: Monitoring and Troubleshooting Data Storage and Processing
In this module, the student will get an overview of the range of monitoring capabilities that are available to provide operational support should there be an issue with a data platform architecture. They will explore common data storage and data processing issues. Finally, disaster recovery options are covered to ensure business continuity.

Lessons
– Explain the monitoring capabilities that are available
– Troubleshoot common data storage issues
– Troubleshoot common data processing issues
– Manage disaster recovery

Lab: Monitoring and Troubleshooting Data Storage and Processing
– Explain the monitoring capabilities that are available
– Troubleshoot common data storage issues
– Troubleshoot common data processing issues
– Manage disaster recovery

After completing this module, students will be able to:
– Explain the monitoring capabilities that are available
– Troubleshoot common data storage issues
– Troubleshoot common data processing issues
– Manage disaster recovery

————————————

Venue:
Tieturi Oy, Mannerheimintie 15, Helsinki

Dates:
18.-20.03.2020

Price:
EUR 1,800 per person + 24% VAT.

Cancellation terms:
For cancellations made less than 14 days before the course, we charge 50% of the participation fee. For cancellations made less than 7 days before the course, we charge the full participation fee. If you do not attend the course, we charge the full participation fee as well as any tests already ordered.
We reserve the right to make changes to courses delivered with our international training partners. For these courses, we will contact you after your registration.