Microsoft Azure AI-100 Full Practice Tests – Total Questions: 531 – 10 Mock Exams
Practice Set 1
Designing and Implementing a Microsoft Azure AI Solution
Question 1 of 36
1. Question
You want to add 20 friends to a collection, and you have four face images for each friend. Each face image is referred to as a face, and each friend is referred to as a person. In the Face API, what is this collection of friends called? (Choose one.)
Correct
A collection of friends, although technically a collection, is called a person group. A person group is a list or collection of people (persons).
Question 2 of 36
2. Question
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You create several AI models in Azure Machine Learning Studio.
You deploy the models to a production environment.
You need to monitor the compute performance of the models.
Solution: You write a custom scoring script.
Does this meet the goal?
Correct
No.
While a custom scoring script can provide insights into model performance, it is not a direct and comprehensive solution for monitoring compute performance. A custom script would require additional development and maintenance, and it might not capture all relevant metrics or provide real-time insights.
To effectively monitor compute performance, it’s recommended to leverage Azure Machine Learning’s built-in monitoring capabilities. These capabilities provide valuable metrics and visualizations, such as CPU utilization, memory usage, and latency, which can help you identify performance bottlenecks and optimize your models.
Question 3 of 36
3. Question
In both the Face API and the Emotion API, which of the following terms describes the rectangular coordinates of a face that’s detected in an image? (Choose one.)
Correct
In the Face API and the Emotion API, rectangular face coordinates are called a location. The location includes the top, left, height, and width of a region in the image that displays a face.
Question 4 of 36
4. Question
You are developing a Computer Vision application.
You plan to use a workflow that will load data from an on-premises database to Azure Blob storage, and then connect to an Azure Machine Learning service.
What should you use to orchestrate the workflow?
Correct
Explanation: With Azure Data Factory, you can use workflows to orchestrate data integration and data transformation processes at scale. You can build data integration pipelines and integrate big data processing and machine learning through a visual interface.
Reference: https://azure.microsoft.com/en-us/services/data-factory/
Question 5 of 36
5. Question
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
Solution: You expose a Machine Learning model as an Azure web service.
Does this meet the goal?
Correct
Explanation: No. Instead, use Azure Stream Analytics and its REST API integration.
Note: Available in both the cloud and on Azure IoT Edge, Azure Stream Analytics offers built-in, machine learning-based anomaly detection capabilities that can be used to monitor the two most commonly occurring anomalies: temporary and persistent. Stream Analytics supports user-defined functions, via REST API, that call out to Azure Machine Learning endpoints.
Reference: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-machine-learning-anomaly-detection
Question 6 of 36
6. Question
When analyzing an image using the Computer Vision API, you can specify visual feature types to return. Which of the following is a valid visual feature type when calling the ‘analyze’ operation?
Correct
If the parameter ‘smartCropping’ is true, the API applies smart cropping to center the thumbnail on the ‘region of interest’. Otherwise, it crops to the center.
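For reference, the sketch below shows one way to call the analyze operation and pass the visualFeatures parameter explicitly. It is a minimal example only; the region, API version (v2.0), subscription key, and image URL are placeholders and may differ for your resource.

```python
import requests

# Assumptions: a Computer Vision resource in eastus and a placeholder key;
# the v2.0 API version is used for illustration and may differ for newer resources.
endpoint = "https://eastus.api.cognitive.microsoft.com/vision/v2.0/analyze"
subscription_key = "<your-computer-vision-key>"

params = {"visualFeatures": "Categories,Description,Color,Faces"}
headers = {
    "Ocp-Apim-Subscription-Key": subscription_key,
    "Content-Type": "application/json",
}
body = {"url": "https://example.com/sample.jpg"}

# Returns a JSON document with one section per requested visual feature.
response = requests.post(endpoint, params=params, headers=headers, json=body)
response.raise_for_status()
print(response.json())
```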
Question 7 of 36
7. Question
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing an application that uses an Azure Kubernetes Service (AKS) cluster.
You are troubleshooting a node issue.
You need to connect to an AKS node by using SSH.
Solution: You create a managed identity for AKS, and then you create an SSH connection.
Does this meet the goal?
Question 8 of 36
8. Question
What two values do you need to know in order to make a call to the Prediction API from client code?
Correct
The URL identifies the endpoint for the connection and the Prediction Key authorizes your app to access the service.
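As a rough illustration of those two values in use, the sketch below sends an image URL to a Custom Vision prediction endpoint. The prediction URL, project ID, iteration name, and Prediction-Key are placeholders copied from the Custom Vision portal and will differ for your project.

```python
import requests

# Assumptions: placeholder prediction URL and key from the Custom Vision portal;
# the exact URL depends on the project, published iteration, and API version.
prediction_url = (
    "https://<region>.api.cognitive.microsoft.com/customvision/v3.0/Prediction/"
    "<project-id>/classify/iterations/<iteration-name>/url"
)
prediction_key = "<your-prediction-key>"

headers = {
    "Prediction-Key": prediction_key,
    "Content-Type": "application/json",
}
body = {"Url": "https://example.com/test-image.jpg"}

response = requests.post(prediction_url, headers=headers, json=body)
response.raise_for_status()
for prediction in response.json().get("predictions", []):
    print(prediction["tagName"], prediction["probability"])
```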
Question 9 of 36
9. Question
When creating a new Custom Vision Service project, what is a domain used for?
Correct
The domain gives the classifier more information about the expected content of the images.
Question 10 of 36
10. Question
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing an application that uses an Azure Kubernetes Service (AKS) cluster.
You are troubleshooting a node issue.
You need to connect to an AKS node by using SSH.
Solution: You add an SSH key to the node, and then you create an SSH connection.
Does this meet the goal?
Correct
Explanation: By default, SSH keys are generated when you create an AKS cluster. If you did not specify your own SSH keys when you created your AKS cluster, add your public SSH keys to the AKS nodes. You also need to create an SSH connection to the AKS node. References: https://docs.microsoft.com/en-us/azure/aks/ssh
Question 11 of 36
11. Question
You’re integrating the Computer Vision API into your solution. You created a Cognitive Services account for the Computer Vision service in the eastus region. Which of the following is the correct address for you to access the OCR operation?
Correct
The endpoint specifies the region you chose during sign up, the service URL, and a resource used on the request.
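A small sketch of how that endpoint address is composed and called is shown below; the region (eastus), API version (v2.0), key, and image URL are assumptions for illustration only.

```python
import requests

# Assumptions: the account was created in eastus and the v2.0 Computer Vision
# API is used; both are placeholders that should match your own resource.
region = "eastus"
ocr_endpoint = f"https://{region}.api.cognitive.microsoft.com/vision/v2.0/ocr"

headers = {
    "Ocp-Apim-Subscription-Key": "<your-computer-vision-key>",
    "Content-Type": "application/json",
}
params = {"language": "unk", "detectOrientation": "true"}
body = {"url": "https://example.com/printed-text.png"}

response = requests.post(ocr_endpoint, headers=headers, params=params, json=body)
print(response.json())
```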
Question 12 of 36
12. Question
Which RBAC role should you assign to the Key Managers group?
Correct
Correct Option:
B. Cognitive Services Contributor: This role provides permissions to manage cognitive services resources, including creating and managing deployments, keys, and other configurations.
Incorrect Options:
A. Security Administrator: This role is more focused on managing security policies and configurations rather than managing cognitive services.
C. Cognitive Services User: This role is typically for users who need to consume cognitive services but not manage them.
D. Security Manager: This role is also more aligned with security management rather than managing cognitive services.
Question 13 of 36
13. Question
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
Solution: You deploy Azure Stream Analytics as an IoT Edge module.
Does this meet the goal?
Correct
Explanation: Available in both the cloud and Azure IoT Edge, Azure Stream Analytics offers built-in machine learning based anomaly detection capabilities that can be used to monitor the two most commonly occurring anomalies: temporary and persistent. Stream Analytics supports user-defined functions, via REST API, that call out to Azure Machine Learning endpoints. References: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-machine-learning-anomaly-detection
Question 14 of 36
14. Question
Your company has factories in 10 countries. Each factory contains several thousand IoT devices.
The devices present status and trending data on a dashboard.
You need to ingest the data from the IoT devices into a data warehouse.
Which two Microsoft Azure technologies should you use? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
Correct
The two Microsoft Azure technologies that you should use to ingest the data from the IoT devices into a data warehouse are:
Azure Stream Analytics
Azure Data Factory
Azure Stream Analytics is a real-time analytics platform that can process large volumes of streaming data from IoT devices. It can filter, aggregate, and transform the data before storing it in a data warehouse.
Azure Data Factory is a cloud-based ETL tool that can be used to orchestrate the data ingestion process. It can schedule data flows, transform data, and load it into a data warehouse.
By using these two technologies together, you can effectively ingest the data from the IoT devices into a data warehouse.
Question 15 of 36
15. Question
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
Solution: You deploy Azure Functions as an IoT Edge module.
Does this meet the goal?
Correct
Explanation: No. Instead, use Azure Stream Analytics and its REST API integration.
Note: Available in both the cloud and on Azure IoT Edge, Azure Stream Analytics offers built-in, machine learning-based anomaly detection capabilities that can be used to monitor the two most commonly occurring anomalies: temporary and persistent. Stream Analytics supports user-defined functions, via REST API, that call out to Azure Machine Learning endpoints.
Reference: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-machine-learning-anomaly-detection
Question 16 of 36
16. Question
You plan to design an application that will use data from Azure Data Lake and perform sentiment analysis by using Azure Machine Learning algorithms.
The developers of the application use a mix of Windows- and Linux-based environments. The developers contribute to shared GitHub repositories.
You need all the developers to use the same tool to develop the application.
What is the best tool to use? More than one answer choice may achieve the goal.
Correct
The best tools to use for developing an application that uses data from Azure Data Lake and performs sentiment analysis by using Azure Machine Learning algorithms are:
Azure Machine Learning Studio
Microsoft Visual Studio Code
Here’s a breakdown of why these tools are the best options:
Azure Machine Learning Studio:
Provides a drag-and-drop interface for building and training machine learning models.
Offers pre-built components for data preparation, feature engineering, and model evaluation.
Can be accessed from any web browser, making it accessible to developers on both Windows and Linux.
Microsoft Visual Studio Code:
A lightweight and versatile code editor that supports a wide range of programming languages.
Offers extensions for working with Azure Machine Learning, including tools for data preparation, model training, and deployment.
Can be used on both Windows and Linux, ensuring consistency across development environments.
While Microsoft Visual Studio is a powerful tool for developing .NET applications, it might not be the best choice for this specific scenario, as it’s primarily focused on Windows development. Azure Notebooks is a cloud-based Jupyter notebook environment that can be used for data analysis and machine learning, but it might not offer the same level of integration with Azure Machine Learning as Azure Machine Learning Studio and Visual Studio Code.
By using Azure Machine Learning Studio and Visual Studio Code, developers can leverage the strengths of both tools and ensure consistency across their development environments.
Question 17 of 36
17. Question
You are designing an AI workflow that will aggregate data stored in Azure as JSON documents.
You expect to store more than 2 TB of new data daily.
You need to choose the data storage service for the data. The solution must minimize costs.
Which data storage service should you choose?
Correct
The best choice for storing large amounts of JSON data in Azure while minimizing costs is Azure Data Lake Storage Gen2.
Here’s why:
Scalability: Data Lake Storage Gen2 is designed to handle massive datasets and can easily scale to accommodate your expected daily data influx of over 2 TB.
Cost-effective: It offers a pay-as-you-go pricing model and provides optimized storage options for large datasets, making it a cost-effective choice.
JSON support: Data Lake Storage Gen2 supports storing data in various formats, including JSON, making it suitable for your use case.
Performance: It is designed for high-performance analytics and can handle large-scale data processing efficiently.
Azure Blob Storage, while also a viable option, might not be as cost-effective or optimized for large-scale data analytics as Data Lake Storage Gen2. Azure File Storage is primarily designed for file sharing and might not be suitable for storing large amounts of unstructured data. Azure Managed Disks are designed for storing data for virtual machines and might not be the most efficient choice for storing large amounts of unstructured JSON data.
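For illustration, a minimal sketch of landing a JSON document in Data Lake Storage Gen2 with the azure-storage-file-datalake package is shown below; the account name, key, file system, and path are placeholders.

```python
from azure.storage.filedatalake import DataLakeServiceClient

# Assumptions: placeholder account name, key, file system, and path; the
# azure-storage-file-datalake package is installed. This only illustrates
# writing one of the daily JSON documents into Data Lake Storage Gen2.
service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential="<storage-account-key>",
)
file_system = service.get_file_system_client("raw")
file_client = file_system.get_file_client("sales/2024/05/01/events.json")

json_payload = b'{"deviceId": "sensor-01", "reading": 21.5}'
file_client.upload_data(json_payload, overwrite=True)
```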
Question 18 of 36
18. Question
You create an Azure Cognitive Services resource.
A developer needs to be able to retrieve the keys used by the resource. The solution must use the principle of least privilege.
What is the best role to assign to the developer? More than one answer choice may achieve the goal.
Question 19 of 36
19. Question
You have Azure IoT Edge devices that generate measurement data from temperature sensors. The data changes very slowly.
You need to analyze the data in a temporal two-minute window. If the temperature rises five degrees above a limit, an alert must be raised. The solution must minimize the development of custom code.
What should you use?
Question 20 of 36
20. Question
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You create several AI models in Azure Machine Learning Studio.
You deploy the models to a production environment.
You need to monitor the compute performance of the models.
Solution: You enable AppInsights diagnostics.
Does this meet the goal?
Correct
No.
While Application Insights can provide valuable insights into the performance of your Azure Machine Learning models, it is not specifically designed to monitor the compute performance of deployed models. To effectively monitor the compute performance of your models, you would need to use additional tools or metrics that are specifically tailored for this purpose.
Here are some potential methods for monitoring the compute performance of your deployed models:
Azure Monitor: Azure Monitor provides a comprehensive platform for monitoring Azure resources, including machine learning models. It can collect metrics such as CPU usage, memory consumption, and network traffic to help you assess the compute performance of your models.
Custom metrics: You can create custom metrics to track specific aspects of your model’s performance, such as inference time or resource utilization. These metrics can be visualized and analyzed in Azure Monitor.
Performance testing: Regularly conduct performance tests to measure the response time, throughput, and resource consumption of your models under different workloads.
Feedback from users: Gather feedback from users to understand how well your models are performing in real-world scenarios.
By combining these methods, you can gain a comprehensive understanding of your models’ compute performance and identify areas for optimization.
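As a hedged illustration of the Azure Monitor approach mentioned above, the sketch below queries platform metrics for the compute resource that hosts the deployed models; the resource ID and metric name are placeholders and depend on the compute target (an AKS cluster in this example).

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient

# Assumptions: placeholder resource ID and metric name; the azure-monitor-query
# and azure-identity packages are installed. The metrics actually exposed depend
# on the compute target that hosts the deployed models.
client = MetricsQueryClient(DefaultAzureCredential())

resource_id = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
    "Microsoft.ContainerService/managedClusters/<aks-cluster>"
)
result = client.query_resource(
    resource_id,
    metric_names=["node_cpu_usage_percentage"],  # placeholder metric name
    timespan=timedelta(hours=1),
)
for metric in result.metrics:
    for series in metric.timeseries:
        print(metric.name, [point.average for point in series.data])
```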
Question 21 of 36
21. Question
You need to build an API pipeline that analyzes streaming data. The pipeline will perform the following:
Visual text recognition
Audio transcription
Sentiment analysis
Face detection
Which Azure Cognitive Services should you use in the pipeline?
Correct
Explanation: Azure Video Indexer is a cloud application built on Azure Media Analytics, Azure Search, and Cognitive Services (such as the Face API, Microsoft Translator, the Computer Vision API, and Custom Speech Service). It enables you to extract insights from your videos using the Video Indexer video and audio models described below:
Visual text recognition (OCR): Extracts text that is visually displayed in the video.
Audio transcription: Converts speech to text in 12 languages and allows extensions.
Sentiment analysis: Identifies positive, negative, and neutral sentiments from speech and visual text.
Face detection: Detects and groups faces appearing in the video.
Reference: https://docs.microsoft.com/en-us/azure/media-services/video-indexer/video-indexer-overview
Question 22 of 36
22. Question
You have thousands of images that contain text.
You need to process the text from the images to a machine-readable character stream.
Which Azure Cognitive Services service should you use?
Question 23 of 36
23. Question
DRAG DROP
You are designing a solution that uses drones to monitor remote locations for anomalies. The drones have Azure IoT Edge devices. The solution must meet the following requirements:
Email a user the picture and location of an anomaly when an anomaly is detected.
Use a video stream to detect anomalies at the location.
Send the pictures and location information to Azure.
Use the least amount of code possible.
You develop a Custom Vision Azure Machine Learning module to detect the anomalies.
Which service should you use for each requirement? To answer, drag the appropriate services to the correct requirements. Each service may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:
Correct
Box 1: Azure IoT Edge
Box 2: Azure Functions
Box 3: Azure Logic Apps
Example: You configure the Remote Monitoring solution to respond to anomalies detected by an IoT Edge device. IoT Edge devices let you process telemetry at the edge to reduce the volume of telemetry sent to the solution and to enable faster responses to events on devices.
Reference: https://docs.microsoft.com/en-us/azure/iot-accelerators/iot-accelerators-remotemonitoring-edge
Question 24 of 36
24. Question
You have several AI applications that use an Azure Kubernetes Service (AKS) cluster. The cluster supports a maximum of 32 nodes.
You discover that occasionally and unpredictably, the application requires more than 32 nodes.
You need to recommend a solution to handle the unpredictable application load.
Which scaling method should you recommend?
Correct
Explanation: To keep up with application demands in Azure Kubernetes Service (AKS), you may need to adjust the number of nodes that run your workloads. The cluster autoscaler component can watch for pods in your cluster that can’t be scheduled because of resource constraints. When issues are detected, the number of nodes is increased to meet the application demand. Nodes are also regularly checked for a lack of running pods, with the number of nodes then decreased as needed. This ability to automatically scale up or down the number of nodes in your AKS cluster lets you run an efficient, cost-effective cluster.
Reference: https://docs.microsoft.com/en-us/azure/aks/cluster-autoscaler
Question 25 of 36
25. Question
You have a database that contains sales data.
You plan to process the sales data by using two data streams named Stream1 and Stream2. Stream1 will be used for purchase order data. Stream2 will be used for reference data.
The reference data is stored in CSV files.
You need to recommend an ingestion solution for each data stream.
What two solutions should you recommend? Each correct answer is a complete solution.
NOTE: Each correct selection is worth one point.
Correct
Explanation:
Solution 1: Stream1 – Azure Event Hubs; Stream2 – Blob Storage.
Azure Event Hubs is a highly scalable data streaming platform and event ingestion service, capable of receiving and processing millions of events per second. Event Hubs can process and store events, data, or telemetry produced by distributed software and devices. Data sent to an event hub can be transformed and stored using any real-time analytics provider or batching/storage adapters. Event Hubs provides publish-subscribe capabilities with low latency at massive scale, which makes it appropriate for big data scenarios.
Solution 2: Stream1 and Stream2 – Blob Storage.
Stream Analytics has first-class integration with Azure data streams as inputs from three kinds of resources: Azure Event Hubs, Azure IoT Hub, and Azure Blob storage. These input resources can live in the same Azure subscription as your Stream Analytics job or a different subscription.
Reference: https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/real-time-ingestion
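For illustration, a minimal sketch of publishing a purchase-order event to the Event Hubs input (Stream1) with the azure-eventhub package is shown below; the connection string and event hub name are placeholders.

```python
import json

from azure.eventhub import EventHubProducerClient, EventData

# Assumptions: placeholder connection string and event hub name; the
# azure-eventhub package is installed. This sends one purchase-order event
# into the Event Hubs input of the downstream Stream Analytics job.
producer = EventHubProducerClient.from_connection_string(
    conn_str="<event-hubs-namespace-connection-string>",
    eventhub_name="purchase-orders",
)

order = {"orderId": "PO-1001", "amount": 250.0, "currency": "USD"}

with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps(order)))
    producer.send_batch(batch)
```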
Question 26 of 36
26. Question
Why should you add a tag after uploading images?
Correct
The tags tell the classifier which characteristics apply to an image. For the image classification to work, you do need to add at least one tag.
Question 27 of 36
27. Question
Your company recently deployed several hardware devices that contain sensors. The sensors generate new data on an hourly basis. The data generated is stored on-premises and retained for several years.
During the past two months, the sensors generated 300 GB of data.
You plan to move the data to Azure and then perform advanced analytics on the data.
You need to recommend an Azure storage solution for the data.
Which storage solution should you recommend?
Question 28 of 36
28. Question
You are designing an AI solution that will provide feedback to teachers who train students over the Internet. The students will be in classrooms located in remote areas. The solution will capture video and audio data of the students in the classrooms.
You need to recommend Azure Cognitive Services for the AI solution to meet the following requirements:
Alert teachers if a student seems angry or distracted.
Identify each student in the classrooms for attendance purposes.
Allow the teachers to log the text of conversations between themselves and the students.
Which Cognitive Services should you recommend?
Correct
Explanation: Azure Video Indexer is a cloud application built on Azure Media Analytics, Azure Search, and Cognitive Services (such as the Face API, Microsoft Translator, the Computer Vision API, and Custom Speech Service). It enables you to extract insights from your videos using the Video Indexer video and audio models.
Face API enables you to search, identify, and match faces in your private repository of up to 1 million people. The Face API now integrates emotion recognition, returning the confidence across a set of emotions for each face in the image, such as anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise. These emotions are understood to be cross-culturally and universally communicated with particular facial expressions.
Speech-to-text from Azure Speech Services enables real-time transcription of audio streams into text that your applications, tools, or devices can consume, display, and take action on as command input. This service is powered by the same recognition technology that Microsoft uses for Cortana and Office products, and works seamlessly with translation and text-to-speech.
Incorrect Answers: Computer Vision and QnA Maker are not required.
References:
https://docs.microsoft.com/en-us/azure/media-services/video-indexer/video-indexer-overview
https://azure.microsoft.com/en-us/services/cognitive-services/face/
https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/speech-to-text
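As a rough illustration of the Face API emotion capability described above, the sketch below calls the detect operation and requests the emotion attribute; the region, key, and image URL are placeholders.

```python
import requests

# Assumptions: placeholder region, key, and image URL; the Face API v1.0 detect
# operation is called with the emotion attribute requested, as described above.
face_endpoint = "https://<region>.api.cognitive.microsoft.com/face/v1.0/detect"

params = {
    "returnFaceId": "true",
    "returnFaceAttributes": "emotion",
}
headers = {
    "Ocp-Apim-Subscription-Key": "<your-face-api-key>",
    "Content-Type": "application/json",
}
body = {"url": "https://example.com/classroom-frame.jpg"}

response = requests.post(face_endpoint, params=params, headers=headers, json=body)
for face in response.json():
    print(face["faceId"], face["faceAttributes"]["emotion"])
```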
Question 29 of 36
29. Question
Your company has a data team of Transact-SQL experts.
You plan to ingest data from multiple sources into Azure Event Hubs.
You need to recommend which technology the data team should use to move and query data from Event Hubs to Azure Storage. The solution must leverage the data team’s existing skills.
What is the best recommendation to achieve the goal? More than one answer choice may achieve the goal.
Correct
Explanation: Event Hubs Capture is the easiest way to automatically deliver streamed data in Event Hubs to an Azure Blob storage or Azure Data Lake store. You can subsequently process and deliver the data to any other storage destination of your choice, such as SQL Data Warehouse or Cosmos DB. You can capture data from your event hub into a SQL data warehouse by using an Azure Function triggered by Event Grid. First, you create an event hub with the Capture feature enabled and set an Azure Blob storage as the destination. Data generated by WindTurbineGenerator is streamed into the event hub and is automatically captured into Azure Storage as Avro files. Next, you create an Azure Event Grid subscription with the Event Hubs namespace as its source and the Azure Function endpoint as its destination. Whenever a new Avro file is delivered to the Azure Storage blob by the Event Hubs Capture feature, Event Grid notifies the Azure Function with the blob URI. The Function then migrates data from the blob to a SQL data warehouse.
Reference: https://docs.microsoft.com/en-us/azure/event-hubs/store-captured-data-data-warehouse
Incorrect
Explanation/Reference: Explanation: Event Hubs Capture is the easiest way to automatically deliver streamed data in Event Hubs to an Azure Blob storage or Azure Data Lake store. You can subsequently process and deliver the data to any other storage destinations of your choice, such as SQL Data Warehouse or Cosmos DB. You to capture data from your event hub into a SQL data warehouse by using an Azure function triggered by an event grid.First, you create an event hub with the Capture feature enabled and set an Azure blob storage as the destination. Data generated by WindTurbineGenerator is streamed into the event hub and is automatically captured into Azure Storage as Avro files. Next, you create an Azure Event Grid subscription with the Event Hubs namespace as its source and the Azure Function endpoint as its destination. Whenever a new Avro file is delivered to the Azure Storage blob by the Event Hubs Capture feature, Event Grid notifies the Azure Function with the blob URI. The Function then migrates data from the blob to a SQL data warehouse. References: https://docs.microsoft.com/en-us/azure/event-hubs/store-captured-data-data-warehouse
Unattempted
Explanation/Reference: Explanation: Event Hubs Capture is the easiest way to automatically deliver data streamed through Event Hubs to Azure Blob storage or Azure Data Lake Store. You can subsequently process and deliver the data to any other storage destination of your choice, such as SQL Data Warehouse or Cosmos DB. You can capture data from your event hub into a SQL data warehouse by using an Azure function triggered by an Event Grid subscription. First, you create an event hub with the Capture feature enabled and set an Azure Blob storage container as the destination. Data generated by WindTurbineGenerator is streamed into the event hub and is automatically captured into Azure Storage as Avro files. Next, you create an Azure Event Grid subscription with the Event Hubs namespace as its source and the Azure Function endpoint as its destination. Whenever a new Avro file is delivered to the Azure Storage blob by the Event Hubs Capture feature, Event Grid notifies the Azure Function with the blob URI. The function then migrates the data from the blob to the SQL data warehouse. References: https://docs.microsoft.com/en-us/azure/event-hubs/store-captured-data-data-warehouse
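As a rough illustration of the pipeline described above, here is a minimal sketch of an Event Grid-triggered Azure Function (Python) that receives the BlobCreated notification for a captured Avro file; the load into the warehouse is only indicated in a comment, and no specific table or connection names are implied by the question.

```python
import logging
import azure.functions as func

def main(event: func.EventGridEvent):
    # Event Grid raises a Microsoft.Storage.BlobCreated event each time
    # Event Hubs Capture writes a new Avro file to the storage account.
    payload = event.get_json()
    blob_url = payload.get("url")
    logging.info("New capture file: %s", blob_url)

    # From here the function would download the Avro blob, parse the records,
    # and bulk-insert them into the SQL data warehouse (details omitted).
```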
Question 30 of 36
30. Question
You have an Azure Machine Learning model that is deployed to a web service.
You plan to publish the web service by using the name ml.contoso.com.
You need to recommend a solution to ensure that access to the web service is encrypted.
Which three actions should you recommend? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
Correct
Explanation/Reference: The process of securing a new web service or an existing one is as follows: 1. Get a domain name. 2. Get a digital certificate. 3. Deploy or update the web service with the SSL setting enabled. 4. Update your DNS to point to the web service. Note: To deploy (or re-deploy) the service with SSL enabled, set the ssl_enabled parameter to True, wherever applicable. Set the ssl_certificate parameter to the value of the certificate file and the ssl_key to the value of the key file. References: https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-secure-web-service
Incorrect
Explanation/Reference: The process of securing a new web service or an existing one is as follows: 1. Get a domain name. 2. Get a digital certificate. 3. Deploy or update the web service with the SSL setting enabled. 4. Update your DNS to point to the web service. Note: To deploy (or re-deploy) the service with SSL enabled, set the ssl_enabled parameter to True, wherever applicable. Set the ssl_certificate parameter to the value of the certificate file and the ssl_key to the value of the key file. References: https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-secure-web-service
Unattempted
Explanation/Reference: The process of securing a new web service or an existing one is as follows: 1. Get a domain name. 2. Get a digital certificate. 3. Deploy or update the web service with the SSL setting enabled. 4. Update your DNS to point to the web service. Note: To deploy (or re-deploy) the service with SSL enabled, set the ssl_enabled parameter to True, wherever applicable. Set the ssl_certificate parameter to the value of the certificate file and the ssl_key to the value of the key file. References: https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-secure-web-service
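For context on step 3 above, a minimal sketch using the Azure Machine Learning Python SDK (v1, azureml-core) is shown below; the certificate and key file names and the choice of an AKS deployment are assumptions made for illustration, while the domain ml.contoso.com comes from the question.

```python
from azureml.core.webservice import AksWebservice

# Deployment configuration with SSL enabled; cert.pem/key.pem are placeholder
# file names for the certificate and private key obtained for ml.contoso.com.
deploy_config = AksWebservice.deploy_configuration(
    ssl_enabled=True,
    ssl_cert_pem_file="cert.pem",
    ssl_key_pem_file="key.pem",
    ssl_cname="ml.contoso.com",
)
# The configuration is then passed to Model.deploy(...); after deployment,
# a DNS record for ml.contoso.com is pointed at the service's IP address.
```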
Question 31 of 36
31. Question
You are designing a solution that will use the Azure Content Moderator service to moderate user-generated content.
You need to moderate custom predefined content without repeatedly scanning the collected content.
Which API should you use?
Correct
Explanation: The default global list of terms in Azure Content Moderator is sufficient for most content moderation needs. However, you might need to screen for terms that are specific to your organization. For example, you might want to tag competitor names for further review. Use the List Management API to create custom lists of terms to use with the Text Moderation API. The Text - Screen operation scans your text for profanity, and also compares text against custom and shared blacklists. Incorrect Answers: B: Use the Text Moderation API in Azure Content Moderator to scan your text content. The operation scans your content for profanity, and compares the content against custom and shared blacklists. References: https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/try-terms-list-api
Incorrect
Explanation: The default global list of terms in Azure Content Moderator is sufficient for most content moderation needs. However, you might need to screen for terms that are specific to your organization. For example, you might want to tag competitor names for further review. Use the List Management API to create custom lists of terms to use with the Text Moderation API. The Text - Screen operation scans your text for profanity, and also compares text against custom and shared blacklists. Incorrect Answers: B: Use the Text Moderation API in Azure Content Moderator to scan your text content. The operation scans your content for profanity, and compares the content against custom and shared blacklists. References: https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/try-terms-list-api
Unattempted
Explanation: The default global list of terms in Azure Content Moderator is sufficient for most content moderation needs. However, you might need to screen for terms that are specific to your organization. For example, you might want to tag competitor names for further review. Use the List Management API to create custom lists of terms to use with the Text Moderation API. The Text - Screen operation scans your text for profanity, and also compares text against custom and shared blacklists. Incorrect Answers: B: Use the Text Moderation API in Azure Content Moderator to scan your text content. The operation scans your content for profanity, and compares the content against custom and shared blacklists. References: https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/try-terms-list-api
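To make the flow concrete, here is a hedged sketch that creates a custom term list with the List Management API and then screens text against it with plain requests; the endpoint, key, list name, and sample term are placeholders, and the REST routes follow the Content Moderator v1.0 API as I understand it.

```python
import requests

# Placeholder resource details for illustration only.
ENDPOINT = "https://<resource>.cognitiveservices.azure.com"
KEY = "<content-moderator-key>"
HEADERS = {"Ocp-Apim-Subscription-Key": KEY}

# 1) Create a custom term list via the List Management API.
resp = requests.post(
    f"{ENDPOINT}/contentmoderator/lists/v1.0/termlists",
    headers={**HEADERS, "Content-Type": "application/json"},
    json={"Name": "CompetitorNames", "Description": "Terms to flag for review"},
)
list_id = resp.json()["Id"]

# 2) Add a term (a RefreshIndex call is typically needed before new terms take
#    effect; omitted here), then screen user text against the list.
requests.post(
    f"{ENDPOINT}/contentmoderator/lists/v1.0/termlists/{list_id}/terms/contoso?language=eng",
    headers=HEADERS,
)
screen = requests.post(
    f"{ENDPOINT}/contentmoderator/moderate/v1.0/ProcessText/Screen",
    params={"listId": list_id, "language": "eng"},
    headers={**HEADERS, "Content-Type": "text/plain"},
    data="Please review the new contoso announcement.",
)
print(screen.json().get("Terms"))
```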
Question 32 of 36
32. Question
You have a solution that runs on a five-node Azure Kubernetes Service (AKS) cluster. The cluster uses an N-series virtual machine.
An Azure Batch AI process runs once a day and rarely on demand.
You need to recommend a solution to maintain the cluster configuration when the cluster is not in use. The solution must not incur any compute costs.
What should you include in the recommendation?
Question 33 of 36
33. Question
Which of the following concepts isn’t part of the Cognitive Services Face API? (Choose one.)
Correct
Concepts (and methods) that are used in the Face API fall into the following categories: verification, detection, identification, similarity, and grouping. Manipulation of faces (morphing) is not part of the Face API.
Incorrect
Concepts (and methods) that are used in the Face API fall into the following categories: verification, detection, identification, similarity, and grouping. Manipulation of faces (morphing) is not part of the Face API.
Unattempted
Concepts (and methods) that are used in the Face API fall into the following categories: verification, detection, identification, similarity, and grouping. Manipulation of faces (morphing) is not part of the Face API.
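As a small illustration of the detection category mentioned above, here is a hedged sketch of calling the Face API detect operation over REST; the endpoint, key, and image URL are placeholders.

```python
import requests

# Placeholder Face resource details.
ENDPOINT = "https://<resource>.cognitiveservices.azure.com"
KEY = "<face-api-key>"

# Detect faces in an image supplied by URL; the response contains one entry
# per detected face, including a face ID and bounding rectangle.
resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    params={"returnFaceId": "true"},
    json={"url": "https://example.com/photo.jpg"},
)
for face in resp.json():
    print(face["faceId"], face["faceRectangle"])
```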
Question 34 of 36
34. Question
You’re trying to set up a web service for your experiment, but the button isn’t available. Why?
Correct
If you haven’t run the experiment, or if you return to the experiment after signing out, you must run it again until every module shows the green check mark that indicates successful completion; only then does the option to set up the web service become available.
Incorrect
If you haven’t run the experiment, or if you return to the experiment after signing out, you must run it again until every module shows the green check mark that indicates successful completion; only then does the option to set up the web service become available.
Unattempted
If you haven’t run the experiment, or if you return to the experiment after signing out, you must run it again until every module shows the green check mark that indicates successful completion; only then does the option to set up the web service become available.
Question 35 of 36
35. Question
Which of the following is the correct direction setting on a queue binding definition in order to send messages to the queue?
Correct
When the message is output from the function to the queue, the direction is out.
Incorrect
When the message is output from the function to the queue, the direction is out.
Unattempted
When the message is output from the function to the queue, the direction is out.
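To ground the direction setting, here is a minimal sketch of an HTTP-triggered Azure Function (Python, v1 programming model) whose paired function.json declares the queue binding with "direction": "out"; the queue and connection names are placeholders.

```python
import azure.functions as func

# The paired function.json would declare the output binding roughly as:
#   {"type": "queue", "direction": "out", "name": "msg",
#    "queueName": "myqueue-items", "connection": "AzureWebJobsStorage"}
def main(req: func.HttpRequest, msg: func.Out[str]) -> func.HttpResponse:
    body = req.get_body().decode("utf-8") or "hello"
    msg.set(body)  # setting the Out parameter writes the message to the queue
    return func.HttpResponse(f"Queued: {body}")
```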
Question 36 of 36
36. Question
You are developing a mobile application that will perform optical character recognition (OCR) from photos. The application will annotate the photos by using metadata, store the photos in Azure Blob storage, and then score the photos by using an Azure Machine Learning model. What should you use to process the data?
Correct
The most appropriate service to process the data for your mobile OCR application is:
Azure Functions.
Here’s why:
Azure Functions are serverless computing units that can be triggered by various events, including HTTP requests, timers, and messages from other Azure services.
In your case, you can create an Azure Function that is triggered by a photo upload from your mobile app.
The function can then perform the OCR, annotate the photo with metadata, and store it in Azure Blob storage.
Finally, the function can send the photo to an Azure Machine Learning model for scoring.
Azure Functions offer a flexible and scalable solution for processing data in response to events, making them ideal for scenarios like yours.
While Azure Stream Analytics, Azure Event Hubs, and Azure Logic Apps can also be used for data processing, they may not be as well-suited for this specific use case. Azure Stream Analytics is primarily designed for real-time data processing, Azure Event Hubs is used for ingesting and streaming data, and Azure Logic Apps are more suited for automating workflows and integrating different services.
Incorrect
The most appropriate service to process the data for your mobile OCR application is:
Azure Functions.
Here’s why:
Azure Functions are serverless computing units that can be triggered by various events, including HTTP requests, timers, and messages from other Azure services.
In your case, you can create an Azure Function that is triggered by a photo upload from your mobile app.
The function can then perform the OCR, annotate the photo with metadata, and store it in Azure Blob storage.
Finally, the function can send the photo to an Azure Machine Learning model for scoring.
Azure Functions offer a flexible and scalable solution for processing data in response to events, making them ideal for scenarios like yours.
While Azure Stream Analytics, Azure Event Hubs, and Azure Logic Apps can also be used for data processing, they may not be as well-suited for this specific use case. Azure Stream Analytics is primarily designed for real-time data processing, Azure Event Hubs is used for ingesting and streaming data, and Azure Logic Apps are more suited for automating workflows and integrating different services.
Unattempted
The most appropriate service to process the data for your mobile OCR application is:
Azure Functions.
Here’s why:
Azure Functions are serverless computing units that can be triggered by various events, including HTTP requests, timers, and messages from other Azure services.
In your case, you can create an Azure Function that is triggered by a photo upload from your mobile app.
The function can then perform the OCR, annotate the photo with metadata, and store it in Azure Blob storage.
Finally, the function can send the photo to an Azure Machine Learning model for scoring.
Azure Functions offer a flexible and scalable solution for processing data in response to events, making them ideal for scenarios like yours.
While Azure Stream Analytics, Azure Event Hubs, and Azure Logic Apps can also be used for data processing, they may not be as well-suited for this specific use case. Azure Stream Analytics is primarily designed for real-time data processing, Azure Event Hubs is used for ingesting and streaming data, and Azure Logic Apps are more suited for automating workflows and integrating different services.
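As a hedged sketch of the event-driven flow described above, the Python function below is triggered when a photo lands in Blob storage and calls the Computer Vision OCR endpoint over REST; the resource endpoint, key, and container are assumptions, and the metadata-annotation and Machine Learning scoring steps are only indicated in comments.

```python
import logging
import os
import requests
import azure.functions as func

# Placeholder settings; real values would come from the function app's configuration.
VISION_ENDPOINT = os.environ.get("VISION_ENDPOINT", "https://<resource>.cognitiveservices.azure.com")
VISION_KEY = os.environ.get("VISION_KEY", "<vision-key>")

def main(photo: func.InputStream):
    # Blob trigger: fires when the mobile app uploads a photo to the container.
    logging.info("Processing %s (%s bytes)", photo.name, photo.length)

    resp = requests.post(
        f"{VISION_ENDPOINT}/vision/v3.2/ocr",
        headers={"Ocp-Apim-Subscription-Key": VISION_KEY,
                 "Content-Type": "application/octet-stream"},
        data=photo.read(),
    )
    resp.raise_for_status()
    words = [w["text"]
             for region in resp.json().get("regions", [])
             for line in region["lines"]
             for w in line["words"]]
    logging.info("OCR text: %s", " ".join(words))

    # Remaining steps (omitted): write the text back as blob metadata and call
    # the Azure Machine Learning scoring endpoint with the extracted features.
```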