AZ-400 Practice Test 6
Question 1 of 56
1. Question
You have an Azure DevOps project that contains a build pipeline. The build pipeline uses approximately 50 open source libraries.
You need to ensure that the project can be scanned for known security vulnerabilities in the open-source libraries.
What should you choose for Dropdown2?
Correct Answer(s): WhiteSource Bolt
WhiteSource Bolt is the CORRECT answer because our project consumes about 50 open source libraries and we need to scan the project for known security vulnerabilities.
We add a WhiteSource Bolt scan as a build task in the current build pipeline. This enables a continuous scan of all the libraries whenever a build runs.
WhiteSource Bolt provides continuous open source software security and compliance management using Azure Pipelines. https://azuredevopslabs.com/labs/vstsextend/whitesource/ https://whitesource.atlassian.net/wiki/spaces/WD/pages/33751265/
Bamboo is an INCORRECT answer because it is a continuous integration and project management tool similar to the Azure DevOps service. It cannot conduct a security assessment scan of the codebase.
CMake is an INCORRECT answer because it cannot run security vulnerability scans on the codebase. It is a cross-platform build system generator that produces and manages native builds for an operating system. https://cmake.org/overview/
Chef is an INCORRECT answer because it is an infrastructure automation and configuration management tool and does not conduct vulnerability assessment scans. https://docs.chef.io/chef_overview/
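As a sketch, after installing the WhiteSource Bolt extension from the Visual Studio Marketplace, the scan becomes one more task in the YAML pipeline. The task identifier and inputs below are assumptions that depend on the installed extension version; check the task assistant in your own organization:

```yaml
# Hypothetical sketch: build, then scan the open-source libraries.
steps:
- script: mvn -B package -DskipTests        # example build step producing the dependency tree
- task: WhiteSource@21                      # assumption: task id varies by extension version
  inputs:
    cwd: '$(System.DefaultWorkingDirectory)'
    projectName: 'MyProject'                # hypothetical project name
```

Because the task runs inside the build pipeline, every build triggers a fresh scan of all 50 libraries with no separate scanning infrastructure.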
Question 2 of 56
2. Question
You have an Azure DevOps organization named Contoso, an Azure DevOps project named Project1, an Azure subscription named Sub1, and an Azure key vault named vault1.
You need to ensure that you can reference the values of the secrets stored in vault1 in all the pipelines of Project1. The solution must prevent the values from being stored in the pipelines.
What should you do?
Correct Answer(s): Create a variable group in Project1
Create a variable group in Project1 is the CORRECT answer because we want all the pipelines of Project1 to use the values of secrets stored in an Azure key vault. From the Library tab of an Azure DevOps project, we create a variable group and link it to an Azure key vault, which synchronizes the secrets for use in all the pipelines of that project. We can also select which of the secrets protected inside the key vault are synchronized with the Azure DevOps project. https://docs.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups
Add a secure file to Project1 is an INCORRECT answer because secure files in a library store certificates for purposes such as signing certificates and SSH keys. They do not fetch secrets from a key vault to make them available to all pipelines.
Modify the security settings of the pipelines is an INCORRECT answer because that only changes the access-level security of the pipelines; it does not help reference the values of secrets stored in a key vault.
Configure the security policy of Contoso is an INCORRECT answer because organization settings have no such option, and we want the secrets available to the pipelines of a single project, not the whole organization.
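Once the variable group is linked to vault1, any pipeline in Project1 consumes the synchronized secrets by name; the values never appear in the pipeline definition. The group and secret names below are hypothetical:

```yaml
# Sketch: reference secrets synchronized from vault1 via a linked variable group.
variables:
- group: vault1-secrets          # hypothetical variable group linked to vault1

steps:
- script: ./deploy.sh
  env:
    DB_PASSWORD: $(dbPassword)   # hypothetical secret name synchronized from vault1
```

Secret variables are not echoed in logs and must be mapped explicitly into `env:` for scripts, which keeps the values out of the stored pipeline definition.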
Question 3 of 56
3. Question
You have a project in Azure DevOps named Project1. Project1 contains a build pipeline named Pipe1 that builds an application named App1.
You have an agent pool named Pool1 that contains a Windows Server 2019-based self-hosted agent. Pipe1 uses Pool1.
You plan to implement another project named Project2. Project2 will have a build pipeline named Pipe2 that builds an application named App2.
App1 and App2 have conflicting dependencies.
You need to minimize the possibility that the two build pipelines will conflict with each other.
The solution must minimize infrastructure costs.
What should you do?
Correct Answer(s): Create two container jobs
Create two container jobs is the CORRECT answer because we must minimize the possibility of the two build pipelines conflicting with each other while keeping infrastructure costs to a bare minimum.
A container job in Azure YAML pipelines runs the build inside a container on top of the operating system of the host machine (the Windows Server 2019-based self-hosted agent). This gives us more control over the dependencies of each build and reduces the possibility of conflicts between the two build pipelines: each build gets its own isolated environment while both reuse the same infrastructure. https://docs.microsoft.com/en-us/azure/devops/pipelines/process/container-phases?view=azure-devops https://docs.microsoft.com/en-us/azure/devops/pipelines/process/phases?view=azure-devops&tabs=yaml
Add another self-hosted agent is an INCORRECT answer because this would increase infrastructure costs rather than minimize them.
Change the self-hosted agent to use Red Hat Enterprise Linux (RHEL) 8 is an INCORRECT answer because changing the operating system of the self-hosted machine does not isolate the dependencies or prevent build conflicts.
Add a Docker Compose task to the build pipelines is an INCORRECT answer because Docker Compose deploys multi-container applications with a single command; it does not isolate the dependencies of two build pipelines.
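A minimal sketch of what one of the two container jobs could look like, assuming Docker is installed on the Pool1 agent. Note that a Windows Server 2019 agent can only run Windows container images; the image name here is a hypothetical example:

```yaml
# Pipe2 sketch: the whole job runs inside a container on the Pool1 agent,
# so App2's dependencies never touch App1's build environment.
pool: Pool1
container: mcr.microsoft.com/dotnet/framework/sdk:4.8   # hypothetical toolchain image
steps:
- script: echo building App2   # placeholder for App2's real build steps
```

Pipe1 would declare a different `container:` image carrying App1's toolchain, resolving the dependency conflict with no extra agents.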
Question 4 of 56
4. Question
You are integrating Azure Pipelines and Microsoft Teams.
You install the Azure Pipelines app in Microsoft Teams.
You have an Azure DevOps organization named Contoso that contains a project named Project1.
You subscribe to Project1 in Microsoft Teams.
You need to ensure that you only receive events about failed builds in Microsoft Teams.
What should you do first?
The correct option to ensure you only receive notifications about failed builds in Microsoft Teams is:
D. From Microsoft Teams, run @azure pipelines subscriptions
Here’s a breakdown of why this is the best approach and why the other options are not ideal:
Recommended Solution:
@azure pipelines subscriptions: This command, run within a Microsoft Teams channel where you’ve subscribed to your Azure DevOps project (Project1), lets you manage your existing subscriptions and view their notification settings.
Notification Settings: Within the subscriptions list you can modify the settings for each event type (e.g., build completion, deployment completion). For the “Build completed” event you can choose to receive notifications for “only failures”.
Why the Other Options Are Not Ideal:
Publish Build Artifacts Task (A): This task uploads build artifacts (output files) to a designated location. It does not affect notification settings for failed builds in Teams.
Subscribe to Project (B): Subscribing to Project1 with @azure pipelines subscribe https://dev.azure.com/Contoso/Project1 in Teams is necessary for receiving notifications, but it sets up a general subscription. You will receive notifications for all pipeline runs (successes, failures, etc.) until you refine the subscription with @azure pipelines subscriptions.
Enable Continuous Integration (C): Enabling continuous integration (CI) triggers the pipeline automatically on code commits. That may be your desired workflow, but it does not filter notifications for failed builds.
Additional Notes:
By using @azure pipelines subscriptions within Teams, you can not only filter for failed builds but also choose notification channels and customize message content.
In conclusion, @azure pipelines subscriptions in Microsoft Teams lets you manage your existing subscriptions and configure them so that you receive alerts only for failed builds in your Azure DevOps project.
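In the Teams channel the interaction looks roughly like this; the exact filter wording may differ across versions of the Azure Pipelines app:

```
@azure pipelines subscribe https://dev.azure.com/Contoso/Project1
    (one-time general subscription to Project1)

@azure pipelines subscriptions
    (lists existing subscriptions; edit the "Build completed" entry
     and set its filter to failed builds only)
```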
Question 5 of 56
5. Question
Your company has a project in Azure DevOps for a new web application.
The company identifies security as one of the highest priorities.
You need to recommend a solution to minimize the likelihood that infrastructure credentials will be leaked.
What should you recommend?
Add Azure Key Vault references to Azure Resource Manager templates is the CORRECT answer because the company places a high priority on security and we want to minimize the risk of exposing Azure infrastructure credentials.
Azure Key Vault is secure cloud storage for secrets, certificates, keys, and other credentials. An ARM template parameters file can reference key vault entities so that the values are retrieved securely at deployment time, without the credentials being exposed anywhere in the process. The snippet below shows how a parameters file references a key vault secret:
"adminPassword": {
    "reference": {
        "keyVault": {
            "id": "/subscriptions//resourceGroups/mykeyvaultdeploymentrg/providers/Microsoft.KeyVault/vaults/"
        },
        "secretName": "vmAdminPassword"
    }
},
Add a Run Inline Azure PowerShell task to the pipeline is an INCORRECT answer because handling credentials in PowerShell keeps them within the scope of the pipeline, which is less secure than the key vault, and it does not guarantee the requirement is met.
Add an Azure Key Vault task to the pipeline is an INCORRECT answer because this task downloads the credentials (which are stored in a key vault) for a pipeline to use. This makes them more vulnerable to a leak than they are inside the key vault.
Add a PowerShell task to the pipeline and run Set-AzureKeyVaultSecret is an INCORRECT answer because to create or update a secret with this command we must first convert a plain-text password to a secure string and pass it as the credential value. Doing so exposes the credentials as plain text, so this is not a secure solution.
Question 6 of 56
6. Question
Note: This question is part of a series of questions that present the same scenario.
You need to recommend an integration strategy for the build process of a Java application. The solution must meet the following requirements:
– The builds must access an on-premises dependency management system.
– The build outputs must be stored as Server artifacts in Azure DevOps.
– The source code must be stored in a Git repository in Azure DevOps.
Solution: Configure an Octopus Tentacle on an on-premises machine. Use the Package Application task in the build pipeline.
Does this meet the goal?
Yes, this solution meets the goal.
Here’s how:
On-premises dependency access:
Octopus Deploy, with an on-premises Tentacle, can connect to your on-premises dependency management system (such as an internal Maven repository) to retrieve the necessary dependencies for your Java build.
Build outputs as Server artifacts:
The Package Application task packages your application (e.g., into a JAR or WAR file).
You can configure the pipeline to publish the resulting package as a Server artifact within your Azure DevOps project, making the build output available to subsequent deployment steps or other pipelines.
Source code in Azure DevOps Git:
The solution assumes your source code is already hosted in a Git repository within Azure DevOps, which is common practice for projects managed on the platform.
In summary:
This solution leverages Octopus Deploy, an established release-automation tool, to handle dependency access and deployment. Combined with an Azure DevOps build pipeline and the Package Application task, it achieves the desired integration while keeping control over your on-premises dependencies.
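To store the packaged output as a Server artifact, the pipeline typically ends with a publish step. This sketch assumes the packaging step has copied the JAR/WAR into the artifact staging directory:

```yaml
steps:
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'  # where the packaged JAR/WAR was copied
    ArtifactName: 'drop'                                # hypothetical artifact name
    publishLocation: 'Container'                        # 'Container' = Server artifact in Azure DevOps
```

`publishLocation: 'Container'` is what satisfies the "stored as Server artifacts" requirement, as opposed to publishing to a file share.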
Note: This question is part of a series of questions that present the same scenario.
You need to recommend an integration strategy for the build process of a Java application. The solution must meet the following requirements:
– The builds must access an on-premises dependency management system.
– The build outputs must be stored as Server artifacts in Azure DevOps.
– The source code must be stored in a Git repository in Azure DevOps.
Solution: Install and configure a self-hosted build agent on an on-premises machine. Configure the build pipeline to use the Default agent pool. Include the Java Tool Installer task in the build pipeline.
Does this meet the goal?
Correct
No, this solution does not fully meet the goal.
Here’s why:
Dependency Management: While the self-hosted agent can run on-premises and potentially access the on-premises dependency management system, this solution does not explicitly address how to effectively manage and retrieve dependencies from that system during the build process.
Build Outputs as Server Artifacts:
Using a self-hosted agent allows you to execute build steps on-premises.
However, to ensure build outputs (like JAR files) are stored as Server artifacts in Azure DevOps, you would need to configure the build pipeline to publish these artifacts back to Azure DevOps after the build completes on the self-hosted agent.
Source Code in Azure DevOps Git:
The solution assumes that the source code is already in Azure DevOps Git, which is a key requirement.
To improve this solution:
Dependency Management:
Incorporate steps in the build pipeline to authenticate with and download dependencies from the on-premises dependency management system.
Consider using tools like Maven or Gradle, which are well-suited for managing Java dependencies.
Artifact Publishing:
Explicitly include a task in the build pipeline (like the “Publish Build Artifacts” task) to upload the generated build output (JAR file) to the Azure DevOps Server as an artifact.
In summary:
While the use of a self-hosted agent provides flexibility, this solution needs enhancements to properly handle dependency management and ensure that the build outputs are effectively published as artifacts within Azure DevOps.
Question 8 of 56
8. Question
A team currently makes use of Docker containers for building their application. The application lifecycle also makes use of Azure Devops. Exploits need to be detected in the Docker images before the container can be used. These exploits must be detected as early in the lifecycle as possible.
Which of the following would you configure as part of the lifecycle?
Correct
It's always good to have an ongoing process that analyses the images in the image registry itself. Some relevant points are also given in a whitepaper on security for containers.
A team has the following Dockerfile that will create an image
FROM windowsservercore
RUN powershell.exe -Command Invoke-WebRequest "https://www.python.org/ftp/python/3.5.1/python-3.5.1.exe" -OutFile c:\python-3.5.1.exe
RUN powershell.exe -Command Start-Process c:\python-3.5.1.exe -ArgumentList '/quiet InstallAllUsers=1 PrependPath=1' -Wait
RUN powershell.exe -Command Remove-Item c:\python-3.5.1.exe -Force
You need to ensure that you optimize the Dockerfile. Which of the following can you do to create an optimized Dockerfile?
Correct
An example of such an optimization is given in the Microsoft documentation: you can combine the steps into a single RUN command in the Dockerfile.
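As an illustrative sketch, the three RUN instructions above can be collapsed into a single RUN instruction. This produces one image layer instead of three, and the installer file never persists in an intermediate layer because it is downloaded, used and removed within the same step:

```dockerfile
FROM windowsservercore
# Download, install, and remove the Python installer in a single layer.
RUN powershell.exe -Command "Invoke-WebRequest https://www.python.org/ftp/python/3.5.1/python-3.5.1.exe -OutFile c:\python-3.5.1.exe; Start-Process c:\python-3.5.1.exe -ArgumentList '/quiet InstallAllUsers=1 PrependPath=1' -Wait; Remove-Item c:\python-3.5.1.exe -Force"
```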
A project team is using Azure Devops for building and deploying projects using pipelines. The application using this infrastructure is a Java based application. You need to ensure a strategy is in place for managing technical debt.
Which of the following would you recommend? Choose 2 answers from the options given below
Correct
SonarQube is the perfect tool that can be used for measuring technical debt. The Microsoft documentation mentions the following
Then ensure to create a pre-deployment approval task so that the approver can view the technical debt before proceeding with the approval
Option A is incorrect because Azure DevTest Labs cannot provide information on technical debt
Option C is incorrect because you need to ensure the reviewer can review the technical debt first
For more information on using SonarQube with Azure Devops, please visit the below URL https://docs.microsoft.com/en-us/azure/devops/java/sonarqube?view=azure-devops
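As a hedged sketch of how SonarQube analysis is typically wired into an Azure Pipelines build for a Maven project (the service connection name 'SonarQubeConnection' is a placeholder, not from the original question):

```yaml
steps:
- task: SonarQubePrepare@5
  inputs:
    SonarQube: 'SonarQubeConnection'   # hypothetical SonarQube service connection
    scannerMode: 'Other'               # let the Maven plugin drive the analysis
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'package'
    sonarQubeRunAnalysis: true
- task: SonarQubePublish@5             # publish the quality gate result to the build
  inputs:
    pollingTimeoutSec: '300'
```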
Question 11 of 56
11. Question
A team currently has the source code repository defined in GitHub. They want to now migrate their code onto Azure Devops. Which of the following steps could be used to clone the repository from GitHub to Azure Devops?
Correct
The process for this is mentioned in the Microsoft documentation as importing from the Git repository.
A company is currently planning on setting up Jenkins on an Azure virtual machine. Code will be build using the Jenkins server and then deployed to a Kubernetes cluster in Azure. The code will be picked up from the Azure container registry.
Which of the following needs to be implemented to ensure traffic can flow into the Jenkins instance on the Azure virtual machine?
Correct
You need to ensure port 8080 is open on the virtual machine. This is mentioned in the Microsoft documentation
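As a minimal sketch, the port can be opened with the Azure CLI, which adds an inbound rule to the VM's network security group (the resource group and VM names below are placeholders):

```shell
# Open inbound port 8080 (the default Jenkins port) on the VM's NSG.
# "MyResourceGroup" and "JenkinsVM" are hypothetical names.
az vm open-port \
    --resource-group MyResourceGroup \
    --name JenkinsVM \
    --port 8080 \
    --priority 1010
```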
A company currently uses ServiceNow for Incident and Change Management. Most of their web-based applications which are developed in-house are hosted in Azure. The company needs to ensure that whenever there is an issue in the application, a ticket is generated. Which of the following can help achieve this?
Correct
This can be done with the help of the IT Service Management connector in Azure Log Analytics. The Microsoft documentation mentions the following
A team is developing an application that is based on the .Net core framework. The application will connect to a Microsoft SQL Server database. During the development stage the application will be developed using on-premise servers. For the production environment, the application will be moved to Azure and use the Azure Web App Service.
During the production stage, where should you store the database connection settings?
Correct
You should place this in the connection strings setting in the Azure Web App. This is also mentioned in the Microsoft documentation
Option A is incorrect since this is not the recommended place to keep the database connection string settings
Option C is incorrect since this is used when the application needs to authenticate using an external identity provider
Option D is incorrect since this is used to authenticate to other resources in Azure
For more information on configuring the App Service, please visit the below URL https://docs.microsoft.com/en-us/azure/app-service/configure-common
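As a hedged sketch, the connection string can be set on the Web App with the Azure CLI (all names and the connection string value below are placeholders):

```shell
# Store the SQL Server connection string in the Web App's connection strings.
# "MyResourceGroup", "MyWebApp", "MyDb" and the value are hypothetical.
az webapp config connection-string set \
    --resource-group MyResourceGroup \
    --name MyWebApp \
    --connection-string-type SQLServer \
    --settings MyDb="Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=appuser;Password=placeholder;"
```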
Question 15 of 56
15. Question
A company wants to implement a package management solution for their Node.js applications. They want to ensure that developers can use their IDE to connect to the repository securely.
Which of the following would contain the credentials to connect to the package management solution?
A company is currently planning on using the Azure Devops service for managing the CI/CD pipeline for various applications. The team wants to have an effective communication tool that can be used across the project. The tool should integrate with Azure Devops and also have a separation of channels for each team.
You decide to implement Bamboo.
Does this fulfil the requirement?
A team is currently using Azure Devops for a Java based project. They need to use a static code analysis tool for the java project. Which of the following are tools that can be used along with Azure Devops for this purpose?
Correct
You can use tools such as PMD and FindBugs along with Azure Devops for static code analysis. The Microsoft documentation mentions the following
Since this is clearly mentioned in the documentation, all other options are incorrect
For more information on the tools for Devops, please visit the below URL https://docs.microsoft.com/en-us/azure/devops/java/standalone-tools?view=azure-devops
Question 18 of 56
18. Question
A team wants to implement Azure Automation DSC for a set of servers. They have currently defined the following configuration
configuration TestConfig {
  Node WebServer {
     WindowsFeature IIS {
        Ensure               = 'Present'
        Name                 = 'Web-Server'
        IncludeAllSubFeature = $true
     }
  }
}
To upload the configuration into your Automation account, which PowerShell cmdlet should we execute?
Correct
The next step is to import the configuration into Azure Automation. The Microsoft documentation mentions the following
Since this is clearly mentioned in the Microsoft documentation, all other options are incorrect
For more information on configuring desired state for servers, please visit the below URL https://docs.microsoft.com/en-us/azure/automation/tutorial-configure-servers-desired-state
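With the current Az PowerShell module the upload step looks like the following sketch (the documentation of the period used the AzureRM equivalent, Import-AzureRmAutomationDscConfiguration; the path and resource names are placeholders):

```powershell
# Upload TestConfig.ps1 into the Automation account and publish it.
Import-AzAutomationDscConfiguration `
    -SourcePath 'C:\DscConfigs\TestConfig.ps1' `
    -ResourceGroupName 'MyResourceGroup' `
    -AutomationAccountName 'MyAutomationAccount' `
    -Published
```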
Question 19 of 56
19. Question
A team needs to create a Kubernetes cluster using the Azure CLI. The Kubernetes cluster needs to have monitoring enabled.
You need to complete the following script for this purpose
Which of the following would go into Slot2?
Correct
This is clearly given in the Microsoft documentation
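For context, a hedged sketch of the full Azure CLI command (the resource group and cluster names are placeholders); monitoring is enabled through the monitoring add-on:

```shell
# Create an AKS cluster with the monitoring add-on enabled.
# "MyResourceGroup" and "MyAKSCluster" are hypothetical names.
az aks create \
    --resource-group MyResourceGroup \
    --name MyAKSCluster \
    --node-count 1 \
    --enable-addons monitoring \
    --generate-ssh-keys
```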
Your team currently has the following Devops environment defined
An Azure Devops deployment
A Jenkins server hosted on an Azure Virtual Machine
You need to ensure that a notification is sent to the Jenkins server whenever a developer commits a change to a branch in Azure Repos.
You decide to implement an email subscription as part of Azure Devops Notification
Would this fulfil the requirement?
Correct
The Notifications in Devops are used for sending internal notifications within the deployment. The Microsoft documentation mentions the following on Azure Devops Notification
Your team currently has the following Devops environment defined
An Azure Devops deployment
A Jenkins server hosted on an Azure Virtual Machine
You need to ensure that a notification is sent to the Jenkins server whenever a developer commits a change to a branch in Azure Repos.
You decide to implement a service hook.
Would this fulfil the requirement?
Correct
Yes, you can create a service hook from the Jenkins server to the Azure Devops repo. An example of this is given in the Microsoft documentation
Your team currently has the following Devops environment defined
An Azure Devops deployment
A Jenkins server hosted on an Azure Virtual Machine
You need to ensure that a notification is sent to the Jenkins server whenever a developer commits a change to a branch in Azure Repos.
You decide to implement a trigger in Azure pipelines.
Would this fulfil the requirement?
A team has the following Dockerfile that will create an image
FROM windowsservercore
RUN powershell.exe -Command Invoke-WebRequest "https://www.python.org/ftp/python/3.5.1/python-3.5.1.exe" -OutFile c:\python-3.5.1.exe
RUN powershell.exe -Command Start-Process c:\python-3.5.1.exe -ArgumentList '/quiet InstallAllUsers=1 PrependPath=1' -Wait
RUN powershell.exe -Command Remove-Item c:\python-3.5.1.exe -Force
You need to ensure that you optimize the Dockerfile. Which of the following can you do to create an optimized Dockerfile?
Correct
An example of such an optimization is given in the Microsoft documentation: you can combine the steps into a single RUN command in the Dockerfile.
Your team is automating the build process for a Java based application by using Azure Devops. The team needs to have code coverage in place and then ensure the outcomes are published to Azure Pipelines. Which of the following can be used for the code coverage?
Correct
Cobertura is a code coverage tool for Java. Below is a snippet of the documentation page for the tool.
It also has the ability to publish results to Azure Devops as mentioned in the Microsoft documentation
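As a sketch, the built-in Maven task can collect and publish Cobertura coverage as part of the build; the input name below comes from the Maven task schema:

```yaml
steps:
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'package'
    codeCoverageToolOption: 'Cobertura'  # collect Cobertura coverage and publish it to the pipeline
```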
A team has an Azure Kubernetes cluster defined. The cluster is Role Based Access Control enabled. The team needs to use Azure Container instances as a hosted environment to run the containers in Azure Kubernetes.
Which of the following actions must you perform to accomplish this?
Correct
There is a blog article detailing the steps. First, ensure that you have a service principal or the required RBAC permissions so that Azure Container Instances can create the instances in Azure Kubernetes.
Then create a YAML file for the deployment of the connector for Azure container instances
Then issue the kubectl command to execute the YAML file
For more information on an example blog article on deploying window container instances to Azure Kubernetes, please visit the below URL https://anthonychu.ca/post/windows-containers-aci-connector-kubernetes/
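The first and third steps above can be sketched with the CLI as follows (the service principal display name and the manifest file name are hypothetical; the YAML file is the connector deployment described above):

```shell
# 1. Create a service principal the ACI connector can use;
#    grant it the permissions the connector needs in your subscription.
az ad sp create-for-rbac --name aci-connector

# 2. Deploy the connector manifest into the AKS cluster.
kubectl apply -f aci-connector.yaml
```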
Question 26 of 56
26. Question
A team is planning on using Azure Automation for a set of Azure Virtual machines. They need to use Azure state configuration to manage the state of the virtual machines. Which of the following actions would need to be performed to ensure the state of the virtual machines is managed effectively?
Choose 5 answers from the options given below
Correct
The Microsoft documentation lists the various steps for onboarding machines onto Azure Automation state configuration
1. The first step is to import an existing configuration
2. The next step is to compile the configuration
3. The next step is to on-board the set of virtual machines onto Azure Automation state configuration
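A hedged Az PowerShell sketch of these three steps (all resource names, paths, and the VM name are placeholders; the documentation of the period used the AzureRM module equivalents):

```powershell
# 1. Import the configuration into the Automation account.
Import-AzAutomationDscConfiguration -SourcePath 'C:\DscConfigs\TestConfig.ps1' `
    -ResourceGroupName 'MyResourceGroup' -AutomationAccountName 'MyAccount' -Published

# 2. Compile the configuration into node configurations.
Start-AzAutomationDscCompilationJob -ConfigurationName 'TestConfig' `
    -ResourceGroupName 'MyResourceGroup' -AutomationAccountName 'MyAccount'

# 3. Onboard (register) a VM as a DSC node.
Register-AzAutomationDscNode -AzureVMName 'MyVM' `
    -ResourceGroupName 'MyResourceGroup' -AutomationAccountName 'MyAccount' `
    -NodeConfigurationName 'TestConfig.WebServer'
```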
Your team is currently using Azure Devops to manage a production-based environment. The team now wants to use Azure Devops to deploy Azure virtual machines to a staging environment. Below are the key requirements
– Ensure the cost of hosting Azure resources is minimized
– Provide the ability to automatically provision virtual machines
– Use a custom Azure Resource Manager template to provision the virtual machines
Which of the following would you use as the implementation step for this requirement?
Correct
You can integrate Azure DevTest Labs into Azure Devops to manage the creation of virtual machines. The Microsoft documentation mentions the following
Options B and C are incorrect since you need to ideally manage this from within the Devops environment itself.
Option A is incorrect since you would not use tasks in the release pipeline to create new virtual machines
For more information on integrating DevTest Labs into Azure Devops, please visit the below URL https://docs.microsoft.com/en-us/azure/lab-services/devtest-lab-integrate-ci-cd-vsts
Question 28 of 56
28. Question
A team is using Azure Resource Manager templates in their Devops pipeline. The templates need to reference secrets in the Azure Key vault dynamically. You need to complete the below snippet of the Resource Manager template
Which of the following would go into Slot1?
Correct
An example of this is given in the Microsoft documentation on an ARM template which is used to reference secrets dynamically from an Azure Key vault service
A team is using Azure Resource Manager templates in their Devops pipeline. The templates need to reference secrets in the Azure Key vault dynamically. You need to complete the below snippet of the Resource Manager template
Which of the following would go into Slot2?
Correct
An example of this is given in the Microsoft documentation on an ARM template which is used to reference secrets dynamically from an Azure Key vault service
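For context, the documented pattern uses a reference object that points at the vault and a secret name; it appears in a parameter file for static references, or inside the parameters of a nested/linked deployment for dynamic ones. A sketch of the shape (subscription ID, resource group, vault and secret names below are placeholders):

```json
{
  "adminPassword": {
    "reference": {
      "keyVault": {
        "id": "/subscriptions/<sub-id>/resourceGroups/MyGroup/providers/Microsoft.KeyVault/vaults/MyVault"
      },
      "secretName": "sqlAdminPassword"
    }
  }
}
```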
A team is currently working on a Java based project in an Azure Devops environment. A solution needs to be recommended to improve the quality of code within the Devops project.
Which of the following would you recommend?
Correct
PMD is a source code analyser tool. Below is what is mentioned on the PMD tool
In Azure Devops, you can enable this as shown in the Microsoft documentation below
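As a sketch, the built-in Maven task can run PMD as part of the build, after which the results appear in the build summary; the input name below comes from the Maven task schema:

```yaml
steps:
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'package'
    pmdRunAnalysis: true  # run PMD static analysis and publish the results
```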
A team is currently building and deploying an application using Azure services which includes the Devops service. You have to advise on the right security tool to use during each of the following phases of the development lifecycle
After a pull request
During the continuous integration phase
During the continuous delivery phase
Which of the following would you use for the pull request phase?
Correct
During the coding phase it is always preferable to use static code analyser tools. The Microsoft documentation mentions the following
A team is currently building and deploying an application using Azure services which includes the Devops service. You have to advise on the right security tool to use during each of the following phases of the development lifecycle
After a pull request
During the continuous integration phase
During the continuous delivery phase
Which of the following would you use for the continuous integration phase?
Correct
During the CI phase it is always preferable to use static code analyser tools. The Microsoft documentation mentions the following
A team is currently building and deploying an application using Azure services which includes the Devops service. You have to advise on the right security tool to use during each of the following phases of the development lifecycle
After a pull request
During the continuous integration phase
During the continuous delivery phase
Which of the following would you use for the continuous delivery phase?
Correct
During the deployment phase we can use penetration testing tools. The Microsoft documentation mentions the following
A team is currently building and deploying an application using Azure services which includes the Devops service. The development team needs to have an efficient way to communicate with each other. The tool should have the following capabilities
Provide the ability to have different communication channels for each team
Have the ability to integrate with Azure Devops
Be available on a variety of platforms such as Windows 10, macOS, iOS and Android
Which of the following would you use for this purpose?
Correct
Slack is the perfect communication tool for this requirement. The following is provided on the tool in the Azure marketplace
A project team is using Azure Devops for building and deploying projects using pipelines. The application using this infrastructure is a Java based application. You need to ensure a strategy is in place for managing technical debt.
Which of the following would you recommend? Choose 2 answers from the options given below
Correct
SonarQube is the perfect tool that can be used for measuring technical debt. The Microsoft documentation mentions the following
Then ensure to create a pre-deployment approval task so that the approver can view the technical debt before proceeding with the approval
Option A is incorrect because Azure DevTest Labs cannot provide information on technical debt
Option C is incorrect because you need to ensure the reviewer can review the technical debt first
For more information on using SonarQube with Azure Devops, please visit the below URL https://docs.microsoft.com/en-us/azure/devops/java/sonarqube?view=azure-devops
Question 36 of 56
36. Question
A team needs to track project metrics in the dashboards available in Azure DevOps.
You need to recommend the right chart widget for each of the following requirements:
Provide the elapsed time from the creation of work items to their completion
Provide the elapsed time to complete work items once they are marked as active
Provide a view of the work remaining
Which of the following chart widgets would you use for the following requirement?
“Ability to provide the elapsed time from the creation of work items to their completion”
Correct
You can use the Lead Time widget for this. The Microsoft documentation describes lead time as the time taken from the creation of a work item to its completion.
A team needs to track project metrics in the dashboards available in Azure DevOps.
You need to recommend the right chart widget for each of the following requirements:
Provide the elapsed time from the creation of work items to their completion
Provide the elapsed time to complete work items once they are marked as active
Provide a view of the work remaining
Which of the following chart widgets would you use for the following requirement?
“Ability to provide the elapsed time to complete work items once they are marked as active”
Correct
You can use the Cycle Time widget for this. The Microsoft documentation describes cycle time as the time taken to complete a work item once it enters the active state.
A team needs to track project metrics in the dashboards available in Azure DevOps.
You need to recommend the right chart widget for each of the following requirements:
Provide the elapsed time from the creation of work items to their completion
Provide the elapsed time to complete work items once they are marked as active
Provide a view of the work remaining
Which of the following chart widgets would you use for the following requirement?
“Ability to provide the view on the work remaining”
Correct
You can use the Burndown widget for this. The Microsoft documentation describes it as a trend of the remaining work across a sprint or release.
A company is currently using Team Foundation Server 2013. They want to migrate to Azure DevOps. Below are the key points that need to be observed for the migration:
All dates for the Team Foundation Version Control changesets need to be preserved
All TFS artifacts need to be migrated
The migration effort should be minimized
Which of the following steps needs to be performed on the Team Foundation Server?
Correct
In the whitepaper showcasing how to migrate from TFS to Azure DevOps, there is a section which states that the TFS server needs to be upgraded to the latest version. This helps ensure that the TFS schema is close to the one used by Azure DevOps Services.
A company is currently using Team Foundation Server 2013. They want to migrate to Azure DevOps. Below are the key points that need to be observed for the migration:
All dates for the Team Foundation Version Control changesets need to be preserved
All TFS artifacts need to be migrated
The migration effort should be minimized
Which of the following steps needs to be carried out for the migration?
Correct
The TFS Database Import Service can be used to ensure the integrity of all data that is copied to Azure DevOps. This tool has recently been rebranded as the Data Migration Tool for Azure DevOps, and it is given as one of the options for the database migration.
Option A is incorrect since, even though it is the easiest option, it won't meet the requirement of “All dates for the Team Foundation Version Control changesets need to be preserved”.
Option C is incorrect since this is not an option for migration.
Option D is incorrect since it would not be the easiest option for the migration.
For more information on the migration options, please visit the below URL https://docs.microsoft.com/en-gb/azure/devops/migrate/migrate-from-tfs?view=azure-devops
Question 41 of 56
41. Question
A team currently makes use of Docker containers for building their application. The application lifecycle also makes use of Azure DevOps. Exploits need to be detected in the Docker images before the containers can be used, and they must be detected as early in the lifecycle as possible.
Which of the following would you configure as part of the lifecycle?
Correct
It's always good to have an ongoing process to analyse the images in the image registry itself. There are also some points on this in a whitepaper on security for containers.
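One way to push detection even earlier is to scan the image in the build pipeline itself, before it is ever pushed to the registry. The following is a hedged sketch using Trivy as an example scanner (the source does not name a specific tool); the image name is a placeholder:

```yaml
# Hypothetical sketch: fail the build if HIGH/CRITICAL vulnerabilities
# are found in the freshly built image. 'myregistry.azurecr.io/webapp'
# is a placeholder image name.
steps:
- script: docker build -t myregistry.azurecr.io/webapp:$(Build.BuildId) .
  displayName: 'Build image'
- script: |
    trivy image --exit-code 1 --severity HIGH,CRITICAL \
      myregistry.azurecr.io/webapp:$(Build.BuildId)
  displayName: 'Scan image for known exploits'
```

A non-zero exit code from the scanner fails the pipeline, so a vulnerable image never reaches the registry.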
A team is currently using Azure DevOps for a production application. The application currently uses Azure Web Apps, Azure SQL Database, and Azure Functions. An email needs to be sent to the internal DevOps team whenever the web application fails to return a status code of 200. Which of the following can be used to accomplish this?
Correct
An example of this is given in the Microsoft documentation, which shows the use of Application Insights availability tests for this sort of requirement.
A company is planning on deploying an application using Azure Pipelines to several virtual machines. The virtual machines will be hosted both in Azure and in the company's on-premises data center. All the virtual machines have the Azure Pipelines agent installed.
You need to recommend a strategy for ensuring the application can be released to the various endpoints.
Which of the following would you use to deploy the application to the virtual machines hosted in Azure?
Correct
No matter where the target virtual machines are hosted, you make use of deployment groups. The Microsoft documentation mentions the following
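Classic release pipelines target deployment groups directly from the UI. In YAML pipelines, the equivalent idea is a deployment job against an environment whose resources are the registered VMs. A minimal sketch (the environment name 'production-vms' is hypothetical):

```yaml
# Hypothetical sketch: a deployment job that runs on each VM registered
# to the 'production-vms' environment, whether Azure-hosted or
# on-premises, since both run the same Azure Pipelines agent.
jobs:
- deployment: DeployWebApp
  environment:
    name: production-vms
    resourceType: VirtualMachine
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo "Deploying on $(Agent.MachineName)"
```

The same job definition covers both sets of machines, which is why the deployment-group/environment approach works regardless of where the VMs live.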
A company is planning on deploying an application using Azure Pipelines to several virtual machines. The virtual machines will be hosted both in Azure and in the company's on-premises data center. All the virtual machines have the Azure Pipelines agent installed.
You need to recommend a strategy for ensuring the application can be released to the various endpoints.
Which of the following would you use to deploy the application to the virtual machines hosted in the on-premises data center?
A team is planning on using Azure DevOps for their applications. They need to configure access to the Azure DevOps agent pools. The access configuration must meet the following requirements:
Ability to use a project agent pool when authoring build and release pipelines
Provide the ability to view the agent pool and agents of the organization
Ensure the principle of least privilege is used
Which of the following roles would you assign at the organization level in Azure DevOps?
Correct
You would assign the Reader role. This fulfils the requirement of “Provide the ability to view the agent pool and agents of the organization”. Of the various agent pool security roles, the Reader role fits this requirement with the least privilege.
A team is planning on using Azure DevOps for their applications. They need to configure access to the Azure DevOps agent pools. The access configuration must meet the following requirements:
Ability to use a project agent pool when authoring build and release pipelines
Provide the ability to view the agent pool and agents of the organization
Ensure the principle of least privilege is used
Which of the following roles would you assign at the project level in Azure DevOps?
Correct
You would assign the Service Account role. This fulfils the requirement of “Ability to use a project agent pool when authoring build and release pipelines”. Of the various agent pool security roles, the Service Account role fits this requirement.
A company has created a new project in Azure DevOps for a web application. The company currently uses ServiceNow for change and release management. The company wants to ensure a change request is processed before any changes are released to production. Which of the following are 2 ways that ServiceNow can be integrated into the Azure DevOps release pipeline?
Correct
Azure DevOps has direct integration with ServiceNow. You don't need to create a SOAP or REST API for this. The Microsoft documentation mentions the following
A company is planning on developing and deploying 4 applications. Each of the applications has different requirements when it comes to version control. The requirements are given below:
ProjectA – Here the project leads must be able to restrict access to individual files and folders in the repository
ProjectB – This project needs to ensure that:
The last build was successful before a check-in
The check-in is associated with at least one work item
ProjectC – Here team members must be able to share code using Xcode
ProjectD – Here it needs to be ensured that the release branch is only viewable by project leads
Which of the following source control systems would you use for ProjectA?
Correct
This sort of requirement is possible with Team Foundation Version Control, which supports permissions at the individual file and folder level. The Microsoft documentation mentions the following
A company is planning on developing and deploying 4 applications. Each of the applications has different requirements when it comes to version control. The requirements are given below:
ProjectA – Here the project leads must be able to restrict access to individual files and folders in the repository
ProjectB – This project needs to ensure that:
The last build was successful before a check-in
The check-in is associated with at least one work item
ProjectC – Here team members must be able to share code using Xcode
ProjectD – Here it needs to be ensured that the release branch is only viewable by project leads
Which of the following source control systems would you use for ProjectB?
Correct
This sort of requirement is possible with Team Foundation Version Control, which supports gated check-in builds and check-in policies that require an associated work item. The Microsoft documentation mentions the following
A company is planning on developing and deploying 4 applications. Each of the applications has different requirements when it comes to version control. The requirements are given below:
ProjectA – Here the project leads must be able to restrict access to individual files and folders in the repository
ProjectB – This project needs to ensure that:
The last build was successful before a check-in
The check-in is associated with at least one work item
ProjectC – Here team members must be able to share code using Xcode
ProjectD – Here it needs to be ensured that the release branch is only viewable by project leads
Which of the following source control systems would you use for ProjectC?
Correct
This can be done with Git, which Xcode supports natively for sharing code. There is also a guide in the Microsoft documentation on the same.
A company is planning on building several Azure DevOps pipelines. They need to use hosted build agents for various application types.
Which of the following build agent pools would you use to build an application that runs on iOS?
Correct
This can be done via the Hosted macOS agent. The Microsoft documentation mentions the following
A company is planning on building several Azure DevOps pipelines. They need to use hosted build agents for various application types.
Which of the following build agent pools would you use to build an application that runs on Internet Information Services in a Docker container?
Correct
This can be done via the Hosted Windows Container agent. The Microsoft documentation mentions the following
A company is planning on building several Azure DevOps pipelines. They need to use hosted build agents for various application types.
Which of the following build agent pools would you use to build an application that runs on Linux?
Correct
This can be done via the Hosted Ubuntu agent. The Microsoft documentation mentions the following
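A minimal sketch of how the three pool choices map to `vmImage` values in a YAML pipeline (the image names follow current Microsoft-hosted naming, which may differ from the older "Hosted …" pool labels):

```yaml
# Hypothetical sketch: one job per target platform, each selecting the
# matching Microsoft-hosted agent pool image.
jobs:
- job: BuildiOS
  pool:
    vmImage: 'macOS-latest'     # Hosted macOS agent for iOS builds
  steps:
  - script: echo "Build the iOS app"
- job: BuildIISContainer
  pool:
    vmImage: 'windows-latest'   # Windows agent for Windows/IIS containers
  steps:
  - script: echo "Build the IIS Docker image"
- job: BuildLinux
  pool:
    vmImage: 'ubuntu-latest'    # Hosted Ubuntu agent for Linux builds
  steps:
  - script: echo "Build the Linux app"
```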
A team is currently using a project in Azure DevOps. The team needs to have a policy in place that ensures the following:
A user should be able to merge to the master branch even if the code fails to compile.
The solution must use the principle of least privilege.
Which of the following would you implement?
Correct
You can set permissions at the branch level. This is also given in the Microsoft documentation.
A team needs to deploy resources using Azure Resource Manager templates. You have to ensure that the users performing the deployment don't have the ability to view the connection strings required by the application being deployed via the template. Which of the following would you use for this requirement?
Correct
The ideal service to use for this purpose is Azure Key Vault. The Microsoft documentation mentions the following
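A common pattern is to reference the secret from the template's parameter file, so the value never appears in the template itself or in the deployment history. A hedged sketch (the subscription, resource group, vault, and secret names are placeholders):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "connectionString": {
      "reference": {
        "keyVault": {
          "id": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.KeyVault/vaults/<vault-name>"
        },
        "secretName": "AppConnectionString"
      }
    }
  }
}
```

For this to work, the vault must be enabled for template deployment (`enabledForTemplateDeployment`), and the deploying user needs deploy access to the vault but not read access to the secret values.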
A company has a set of teams working on mobile applications. They currently use App Center to build and test the applications. The following requirement needs to be met for the development environment:
“Access to the application builds must be managed at the organization level”
Which of the following would you set as the group to control access to the builds?
Correct
The correct answer is:
C. Azure Active Directory Groups
Azure Active Directory (Azure AD) groups provide the most flexible and scalable way to manage access to applications in App Center. By using Azure AD groups, you can:
Centralize management: Azure AD groups can be managed from a central location, making it easier to control access to applications.
Grant access to multiple applications: You can add multiple applications to a single Azure AD group, making it easier to manage access for teams that need to work on multiple applications.
Use existing identity infrastructure: If your company already uses Azure AD for other purposes, you can leverage your existing identity infrastructure to manage access to App Center applications.
Here's a breakdown of the other options:
A. Active Directory Federation Groups: This option is only applicable if your company is using Active Directory Federation Services (AD FS) to federate with Azure AD.
B. Active Directory Groups: This option is only applicable if your company is using on-premises Active Directory.
D. Visual Studio App Center distribution groups: This option is specific to Visual Studio App Center and does not provide the same level of flexibility and scalability as Azure AD groups.
Therefore, Azure AD groups are the best option for managing access to applications in App Center at the organization level.