Exam DP-100: Designing and Implementing a Data Science Solution on Azure
Exam Number: DP-100 | Length of test: 120 mins |
Exam Name: Designing and Implementing a Data Science Solution on Azure | Number of questions in the actual exam: 40-60 |
Format: PDF, VPLUS | Passing Score: 700/1000 |
Total Questions: 433 | $30 Premium PDF file | 2 months of updates | Last updated: November 2024 |
Total Questions: 433 | FREE Premium VPLUS file | Last updated: November 2024 |
Download practice test questions – DP-100 exam topic
Title | Size | Hits | Download |
---|---|---|---|
Microsoft.DP-100.vSep-2024.by.Andien.178q | 11.05 MB | 46 | Download |
Microsoft.DP-100.vSep-2024.by.Andien.178q | 17.23 MB | 30 | Download |
Microsoft.DP-100.vJul-2024.by.Isanco.262q | 14.97 MB | 64 | Download |
Microsoft.DP-100.vJul-2024.by.Isanco.262q | 22.63 MB | 53 | Download |
Microsoft.DP-100.vJan-2024.by.Stephan.228q | 16.28 MB | 69 | Download |
Microsoft.DP-100.vOct-2023.by.Linh.209q | 11.31 MB | 60 | Download |
Study guide for Exam DP-100: Designing and Implementing a Data Science Solution on Azure
Audience profile
As a candidate for this exam, you should have subject matter expertise in applying data science and machine learning to implement and run machine learning workloads on Azure.
Your responsibilities for this role include:
- Designing and creating a suitable working environment for data science workloads.
- Exploring data.
- Training machine learning models.
- Implementing pipelines.
- Running jobs to prepare for production.
- Managing, deploying, and monitoring scalable machine learning solutions.
As a candidate for this exam, you should have knowledge and experience in data science by using:
- Azure Machine Learning
- MLflow
Skills at a glance
Design and prepare a machine learning solution (20–25%)
- Design a machine learning solution
- Manage an Azure Machine Learning workspace
- Manage data in an Azure Machine Learning workspace
- Manage compute for experiments in Azure Machine Learning
Explore data, and train models (35–40%)
- Explore data by using data assets and data stores
- Create models by using the Azure Machine Learning designer
- Use automated machine learning to explore optimal models
- Use notebooks for custom model training
- Tune hyperparameters with Azure Machine Learning
Prepare a model for deployment (20–25%)
- Run model training scripts
- Implement training pipelines
- Manage models in Azure Machine Learning
Deploy and retrain a model (10–15%)
- Deploy a model
- Apply machine learning operations (MLOps) practices
Some new sample questions:
Question:
DRAG DROP
You manage an Azure Machine Learning workspace that has an Azure Machine Learning datastore.
Data must be loaded from the following sources:
* a credential-less Azure Blob Storage datastore
* an Azure Data Lake Storage (ADLS) Gen2 datastore that is not credential-less
You need to define the authentication mechanisms to access data in the Azure Machine Learning datastore.
Which data access mechanism should you use? To answer, move the appropriate data access mechanisms to the correct storage types. You may use each data access mechanism once, more than once, or not at all. You may need to move the split bar between panes or scroll to view content.
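For background on this question: a credential-less Azure ML datastore is simply one defined without a `credentials` section, so data access falls back to identity-based authentication. A minimal sketch of such a datastore definition, assuming the CLI v2 blob datastore schema (the name, account, and container values are placeholders):

```yaml
# Hypothetical credential-less blob datastore: no "credentials" section is
# given, so Azure ML uses identity-based (Microsoft Entra ID) access instead
# of an account key or SAS token.
$schema: https://azuremlschemas.azureedge.net/latest/azureBlob.schema.json
name: credless_blob_store
type: azure_blob
account_name: mystorageaccount
container_name: data
```

Adding a `credentials` section (for example an account key or SAS token) would make the same datastore credential-based.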
Question:
You have an Azure Machine Learning workspace. You plan to tune model hyperparameters by using a sweep job.
You need to find a sampling method that supports early termination of low-performance jobs and continuous hyperparameters.
Solution: Use the Bayesian sampling method over the hyperparameter space.
Does the solution meet the goal?
A. Yes
B. No
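As context for sweep questions like this one, the sampling algorithm and the early-termination policy are configured as separate sections of a sweep job, and not every combination of the two is supported. A hedged fragment using the CLI v2 sweep-job schema, shown here with random sampling and a bandit policy (all values are placeholders):

```yaml
# Fragment of a sweep job definition; trial, command, and objective
# sections are omitted for brevity.
$schema: https://azuremlschemas.azureedge.net/latest/sweepJob.schema.json
type: sweep
sampling_algorithm:
  type: random          # other values include "grid" and "bayesian"
search_space:
  learning_rate:
    type: uniform       # a continuous hyperparameter
    min_value: 0.001
    max_value: 0.1
early_termination:
  type: bandit          # stops low-performing trials early
  slack_factor: 0.1
  evaluation_interval: 2
```

Checking which `sampling_algorithm` types allow an `early_termination` policy is exactly what this question is probing.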
Question:
You manage an Azure Machine Learning workspace.
You must set up an event-driven process to trigger a retraining pipeline.
You need to configure an Azure service that will trigger a retraining pipeline in response to data drift in Azure Machine Learning datasets. Which Azure service should you use?
A. Event Grid
B. Azure Functions
C. Event Hubs
D. Logic Apps
Question:
You use an Azure Machine Learning workspace.
You must monitor cost at the endpoint and deployment level.
You have a trained model that must be deployed as an online endpoint. Users must authenticate by using Microsoft Entra ID.
What should you do?
A. Deploy the model to Azure Kubernetes Service (AKS). During deployment, set the token_auth_mode parameter of the target configuration object to true.
B. Deploy the model to a managed online endpoint. During deployment, set the token_auth_mode parameter of the target configuration object to true.
C. Deploy the model to Azure Kubernetes Service (AKS). During deployment, set the auth_mode parameter to configure the authentication type.
D. Deploy the model to a managed online endpoint. During deployment, set the auth_mode parameter to configure the authentication type.
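For reference, on a managed online endpoint the authentication type is a property of the endpoint definition. A minimal sketch using the CLI v2 endpoint schema, assuming `aad_token` selects Microsoft Entra ID token authentication (the endpoint name is a placeholder):

```yaml
# Hypothetical managed online endpoint definition.
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineEndpoint.schema.json
name: my-endpoint
# auth_mode chooses how callers authenticate:
#   key       - static keys
#   aml_token - Azure ML tokens
#   aad_token - Microsoft Entra ID tokens
auth_mode: aad_token
```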
……….
Some new questions:
Q
You manage an Azure Machine Learning workspace named Workspace1.
You plan to create a pipeline in the Azure Machine Learning Studio designer. The pipeline must include a custom component. You need to ensure the custom component can be used in the pipeline. What should you do first?
A. Add a linked service to Workspace1.
B. Create a pipeline endpoint.
C. Upload a JSON file to Workspace1.
D. Upload a YAML file to Workspace1.
E. Create a datastore.
Q
You use Azure Machine Learning Designer to load the following datasets into an experiment:
Dataset1: (dataset preview not reproduced here)
Dataset2: (dataset preview not reproduced here)
You need to create a dataset that has the same columns and header row as the input datasets and contains all rows from both input datasets.
Solution: Use the Add Rows component.
Does the solution meet the goal?
A. Yes
B. No
Q
You create an Azure Machine Learning workspace named workspace1. The workspace contains a Python SDK v2 notebook that uses MLflow to collect model training metrics and artifacts from your local computer.
You must reuse the notebook to run on an Azure Machine Learning compute instance in workspace1.
You need to continue to log metrics and artifacts from your data science code.
What should you do?
A. Configure the tracking URI.
B. Instantiate the job class.
C. Log in to workspace1.
D. Instantiate the MLClient class.
Q
You manage an Azure Machine Learning workspace. You plan to import data from Azure Data Lake Storage Gen2. You need to build a URI that represents the storage location. Which protocol should you use?
A. abfss
B. https
C. adl
D. wasbs
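As background for this question, ADLS Gen2 paths follow the general shape `abfss://<filesystem>@<account>.dfs.core.windows.net/<path>`. A small sketch that assembles such a URI (the helper name and all argument values are illustrative, not an Azure SDK API):

```python
def adls_gen2_uri(filesystem: str, account: str, path: str) -> str:
    """Build an abfss:// URI for an ADLS Gen2 location.

    Shape: abfss://<filesystem>@<account>.dfs.core.windows.net/<path>
    """
    # Strip any leading slash so the path joins cleanly onto the host.
    return f"abfss://{filesystem}@{account}.dfs.core.windows.net/{path.lstrip('/')}"

print(adls_gen2_uri("raw", "mystorageaccount", "datasets/iris.csv"))
# → abfss://raw@mystorageaccount.dfs.core.windows.net/datasets/iris.csv
```

By contrast, `wasbs://` addresses classic Blob Storage over the blob endpoint and `adl://` belongs to the older Data Lake Storage Gen1 service.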
………….
Exam DP-100 valid?
Hi,
Exam DP-100 is valid now.
Rating: +89%.
Thanks