DVA-C02 Exam
Exam Number: DVA-C02 | Length of test: 130 mins |
Exam Name: AWS Certified Developer – Associate | Number of questions in the actual exam: 65 |
Format: PDF, VPLUS | Passing Score: 720/1000 |
Total Questions: 371 |
Premium PDF file (2 months of updates) and FREE Premium VPLUS file | Last updated: March-2025 |
Download practice test questions – DVA-C02 exam topic
Title | Size | Hits | Download |
---|---|---|---|
Amazon.DVA-C02.vJan-2025.by.Tondissim.111q | 795.00 KB | 32 | Download |
Amazon.DVA-C02.vJan-2025.by.Tondissim.111q | 189.16 KB | 26 | Download |
Amazon.DVA-C02.vOct-2024.by.Lien.99q | 698.60 KB | 46 | Download |
Amazon.DVA-C02.vOct-2024.by.Lien.99q | 141.36 KB | 35 | Download |
Amazon.Vdumps.DVA-C02.127q | 1.36 MB | 94 | Download |
Amazon.DVA-C02.vSep-2023.by.Roly.89q | 432.02 KB | 102 | Download |
Some new sample questions:
Question:
A developer is building various microservices for an application that will run on Amazon EC2 instances. The developer needs to monitor the end-to-end view of the requests between the microservices and debug any issues in the various microservices.
What should the developer do to accomplish these tasks?
A. Use Amazon CloudWatch to aggregate the microservices’ logs and metrics, and build the monitoring dashboard.
B. Use AWS CloudTrail to aggregate the microservices’ logs and metrics, and build the monitoring dashboard.
C. Use the AWS X-Ray SDK to add instrumentation in all the microservices, and monitor using the X-Ray service map.
D. Use AWS Health to monitor the health of all the microservices.
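For context on option C: the X-Ray service map is assembled from segment documents that each instrumented microservice emits (normally generated automatically by the `aws_xray_sdk` middleware). A minimal sketch of such a segment, with hypothetical service names and IDs:

```python
import json
import time

# Minimal X-Ray segment document, the unit the service map is built from.
# Service names and IDs are hypothetical; aws_xray_sdk generates real ones.
now = time.time()
segment = {
    "name": "orders-service",          # becomes a node in the service map
    "trace_id": "1-67891233-abcdef012345678912345678",
    "id": "70de5b6f19ff9a0a",
    "start_time": now,
    "end_time": now + 0.042,
    "subsegments": [
        {   # a downstream call; drawn as an edge in the service map
            "name": "inventory-service",
            "id": "70de5b6f19ff9a0b",
            "start_time": now + 0.005,
            "end_time": now + 0.030,
            "namespace": "remote",
        }
    ],
}
print(json.dumps(segment)[:60])
```

Tracing every hop this way is what gives the end-to-end request view that per-service CloudWatch logs alone cannot.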
Question:
A company has an application that uses an Amazon S3 bucket for object storage. A developer needs to configure in-transit encryption for the S3 bucket. All the S3 objects containing personal data need to be encrypted at rest with AWS KMS keys, which can be rotated on demand.
Which combination of steps will meet these requirements? (Select TWO.)
A. Write an S3 bucket policy to allow only encrypted connections over HTTPS by using a permissions boundary.
B. Configure an S3 bucket policy to enable client-side encryption for the objects containing personal data by using an AWS KMS customer managed key.
C. Configure the application to encrypt the objects by using an AWS KMS customer managed key before uploading the objects containing personal data to Amazon S3.
D. Write an S3 bucket policy to allow only encrypted connections over HTTPS by using the aws:SecureTransport condition.
E. Configure S3 Block Public Access settings for the S3 bucket to allow only encrypted connections over HTTPS.
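The `aws:SecureTransport` condition named in option D is applied as a Deny statement in the bucket policy. A minimal sketch, with a placeholder bucket name:

```python
import json

# Bucket policy denying any request that arrives over plain HTTP.
# "amzn-s3-demo-bucket" is a placeholder bucket name.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::amzn-s3-demo-bucket",
                "arn:aws:s3:::amzn-s3-demo-bucket/*",
            ],
            # aws:SecureTransport is "false" only for HTTP requests,
            # so HTTPS traffic is unaffected by this Deny.
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        }
    ],
}
print(json.dumps(policy))
```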
Question:
A developer is automating a new application deployment with AWS SAM. The new application has one AWS Lambda function and one Amazon S3 bucket. The Lambda function must access the S3 bucket to only read objects.
How should the developer configure AWS SAM to grant the necessary read permission to the S3 bucket?
A. Reference a second Lambda authorizer function.
B. Add a custom S3 bucket policy to the Lambda function.
C. Create an Amazon SQS topic for only S3 object reads. Reference the topic in the template.
D. Add the S3ReadPolicy template to the Lambda function’s execution role.
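For context on option D: SAM policy templates such as `S3ReadPolicy` are listed under a function's `Policies` property, and SAM expands them into statements on the function's execution role. A sketch with placeholder resource names:

```yaml
# Sketch of a SAM template; SourceBucket / MyFunction are placeholder names.
Resources:
  SourceBucket:
    Type: AWS::S3::Bucket
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler
      Runtime: python3.12
      Policies:
        - S3ReadPolicy:          # SAM policy template -> read-only S3 access
            BucketName: !Ref SourceBucket
```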
……….
Some new sample questions:
Question:
A developer is building a three-tier web application that should be able to handle a minimum of 5000 requests per minute. Requirements state that the web tier should be completely stateless while the application maintains session state for the users.
How can session data be externalized, keeping latency at the LOWEST possible value?
A. Create an Amazon RDS instance, then implement session handling at the application level to leverage a database inside the RDS database instance for session data storage.
B. Implement a shared file system solution across the underlying Amazon EC2 instances, then implement session handling at the application level to leverage the shared file system for session data storage.
C. Create an Amazon ElastiCache (Memcached) cluster, then implement session handling at the application level to leverage the cluster for session data storage.
D. Create an Amazon DynamoDB table, then implement session handling at the application level to leverage the table for session data storage.
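Whichever store is chosen, externalized session handling reduces to get/put keyed by a session ID with a TTL, so any web-tier instance can serve any request. A minimal sketch, using an in-memory dict as a stand-in for the real ElastiCache or DynamoDB client:

```python
import time
import uuid

# In-memory dict standing in for a Memcached/DynamoDB client; the web
# tier itself keeps no local state, only the session ID travels with
# the user (e.g. in a cookie).
_store = {}

def put_session(data: dict, ttl_seconds: int = 1800) -> str:
    sid = str(uuid.uuid4())
    _store[sid] = (data, time.time() + ttl_seconds)
    return sid

def get_session(sid: str):
    entry = _store.get(sid)
    if entry is None:
        return None
    data, expires_at = entry
    if time.time() >= expires_at:   # expired sessions read as missing
        del _store[sid]
        return None
    return data

sid = put_session({"user": "alice", "cart": 3})
print(get_session(sid))
```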
Question:
A company has a large amount of data in an Amazon DynamoDB table. A large batch of data is appended to the table once each day. The company wants a solution that will make all the existing and future data in DynamoDB available for analytics on a long-term basis.
Which solution meets these requirements with the LEAST operational overhead?
A. Configure DynamoDB incremental exports to Amazon S3.
B. Configure Amazon DynamoDB Streams to write records to Amazon S3.
C. Configure Amazon EMR to copy DynamoDB data to Amazon S3.
D. Configure Amazon EMR to copy DynamoDB data to Hadoop Distributed File System (HDFS).
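Option A maps to DynamoDB's `export_table_to_point_in_time` API with an incremental export type. A sketch of the request parameters only (ARNs, bucket, and times are placeholders); the actual call would be `boto3.client("dynamodb").export_table_to_point_in_time(**params)`:

```python
from datetime import datetime, timezone

# Placeholder ARN, bucket, and time window for an incremental export.
params = {
    "TableArn": "arn:aws:dynamodb:us-east-1:123456789012:table/Orders",
    "S3Bucket": "analytics-export-bucket",
    "ExportFormat": "DYNAMODB_JSON",
    "ExportType": "INCREMENTAL_EXPORT",
    "IncrementalExportSpecification": {
        # Only changes in this window are exported, so each daily batch
        # can be picked up without re-exporting the whole table.
        "ExportFromTime": datetime(2025, 3, 1, tzinfo=timezone.utc),
        "ExportToTime": datetime(2025, 3, 2, tzinfo=timezone.utc),
        "ExportViewType": "NEW_AND_OLD_IMAGES",
    },
}
print(params["ExportType"])
```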
Question:
An ecommerce company is planning to migrate an on-premises Microsoft SQL Server database to the AWS Cloud. The company needs to migrate the database to SQL Server Always On availability groups. The cloud-based solution must be highly available.
Which solution will meet these requirements?
A. Deploy three Amazon EC2 instances with SQL Server across three Availability Zones. Attach one Amazon Elastic Block Store (Amazon EBS) volume to the EC2 instances.
B. Migrate the database to Amazon RDS for SQL Server. Configure a Multi-AZ deployment and read replicas.
C. Deploy three Amazon EC2 instances with SQL Server across three Availability Zones. Use Amazon FSx for Windows File Server as the storage tier.
D. Deploy three Amazon EC2 instances with SQL Server across three Availability Zones. Use Amazon S3 as the storage tier.
Question:
A social media application is experiencing high volumes of new user requests after a recent marketing campaign. The application is served by an Amazon RDS for MySQL instance. A solutions architect examines the database performance and notices high CPU usage and many ‘too many connections’ errors that lead to failed requests on the database. The solutions architect needs to address the failed requests.
Which solution will meet this requirement?
A. Deploy an Amazon DynamoDB Accelerator (DAX) cluster. Configure the application to use the DAX cluster.
B. Deploy an RDS Proxy. Configure the application to use the RDS Proxy.
C. Migrate the database to an Amazon RDS for PostgreSQL instance.
D. Deploy an Amazon ElastiCache (Redis OSS) cluster. Configure the application to use the ElastiCache cluster.
………
Some new questions:
Q
A company is creating a new application that gives users the ability to upload and share short video files. The average size of the video files is 10 MB. After a user uploads a file, a message needs to be placed into an Amazon Simple Queue Service (Amazon SQS) queue so the file can be processed. The files need to be accessible for processing within 5 minutes.
Which solution will meet these requirements MOST cost-effectively?
A. Write the files to Amazon S3 Glacier Deep Archive. Add the S3 location of the files to the SQS queue.
B. Write the files to Amazon S3 Standard. Add the S3 location of the files to the SQS queue.
C. Write the files to an Amazon Elastic Block Store (Amazon EBS) General Purpose SSD volume. Add the EBS location of the files to the SQS queue.
D. Write messages that contain the contents of the uploaded files to the SQS queue.
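One constraint worth noting when weighing option D: SQS messages max out at 256 KB, far below the 10 MB average file size, so the queue carries a pointer to the stored object rather than its contents. A sketch of such a message body, with placeholder bucket and key names:

```python
import json

# The SQS message body holds the S3 location of the uploaded video,
# not the video bytes (bucket/key names are placeholders).
body = json.dumps({
    "bucket": "video-uploads",
    "key": "incoming/2025/03/clip-0001.mp4",
    "size_bytes": 10 * 1024 * 1024,
})

# A consumer parses the pointer and fetches the object from S3.
pointer = json.loads(body)
print(pointer["key"])
```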
Q
A developer needs to retrieve all data from an Amazon DynamoDB table that matches a particular partition key.
Which solutions will meet this requirement in the MOST operationally efficient way? (Select TWO.)
A. Use the Scan API and a filter expression to match on the key.
B. Use the GetItem API with a request parameter for key that contains the partition key name and specific key value.
C. Use the ExecuteStatement API and a filter expression to match on the key.
D. Use the GetItem API and a PartiQL statement to match on the key.
E. Use the ExecuteStatement API and a PartiQL statement to match on the key.
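For context on option E: `ExecuteStatement` takes a PartiQL statement, and selecting on the partition key returns every item in that partition. A sketch of the request only (table and attribute names are placeholders); the real call is `boto3.client("dynamodb").execute_statement(**request)`:

```python
# Parameterized PartiQL request; "Orders" and "pk" are placeholder
# table and partition-key names.
request = {
    "Statement": 'SELECT * FROM "Orders" WHERE pk = ?',
    "Parameters": [{"S": "customer#1234"}],
}
print(request["Statement"])
```

Because the condition is on the partition key, DynamoDB can serve this as an efficient key lookup rather than a full scan.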
Q
A company has a web application that contains an Amazon API Gateway REST API. A developer has created an AWS CloudFormation template for the initial deployment of the application. The developer has deployed the application successfully as part of an AWS CodePipeline continuous integration and continuous delivery (CI/CD) process. All resources and methods are available through the deployed stage endpoint.
The CloudFormation template contains the following resource types:
* AWS::ApiGateway::RestApi
* AWS::ApiGateway::Resource
* AWS::ApiGateway::Method
* AWS::ApiGateway::Stage
* AWS::ApiGateway::Deployment
The developer adds a new resource to the REST API with additional methods and redeploys the template. CloudFormation reports that the deployment is successful and that the stack is in the UPDATE_COMPLETE state. However, calls to all new methods are returning 404 (Not Found) errors.
What should the developer do to make the new methods available?
A. Specify the disable-rollback option during the update-stack operation.
B. Unset the CloudFormation stack failure options.
C. Add an AWS CodeBuild stage to CodePipeline to run the aws apigateway create-deployment AWS CLI command.
D. Add an action to CodePipeline to run the aws cloudfront create-invalidation AWS CLI command.
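Background for this question: a REST API stage serves only the snapshot captured by its last deployment, so resources added after that deployment return 404 until a new one is created. A sketch of the parameters for the `create-deployment` call named in option C (IDs are placeholders); the equivalent boto3 call is `boto3.client("apigateway").create_deployment(**params)`:

```python
# Placeholder REST API ID and stage name; equivalent to
# `aws apigateway create-deployment --rest-api-id a1b2c3d4e5 --stage-name prod`.
params = {
    "restApiId": "a1b2c3d4e5",
    "stageName": "prod",
    "description": "Pick up newly added resources and methods",
}
print(params["stageName"])
```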
Q
A company is developing an application that will be accessed through the Amazon API Gateway REST API. Registered users should be the only ones who can access certain resources of this API. The token being used should expire automatically and needs to be refreshed periodically.
How can a developer meet these requirements?
A. Create an Amazon Cognito identity pool, configure the Amazon Cognito Authorizer in API Gateway, and use the temporary credentials generated by the identity pool.
B. Create and maintain a database record for each user with a corresponding token and use an AWS Lambda authorizer in API Gateway.
C. Create an Amazon Cognito user pool, configure the Cognito Authorizer in API Gateway, and use the identity or access token.
D. Create an IAM user for each API user, attach an invoke permissions policy to the API, and use an IAM authorizer in API Gateway.
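The expiry behavior the question describes is what Cognito user pool tokens provide: the identity and access tokens are JWTs carrying an `exp` claim, and the refresh token is exchanged for fresh ones. A sketch that decodes the claim from a hand-built example token (not a real, signed Cognito token):

```python
import base64
import json
import time

def b64url(d: dict) -> str:
    # JWT segments are base64url-encoded JSON with padding stripped.
    return base64.urlsafe_b64encode(json.dumps(d).encode()).rstrip(b"=").decode()

# Hand-built stand-in for an access token (header.payload.signature);
# real Cognito tokens are signed by the user pool and verified by the
# API Gateway authorizer.
now = int(time.time())
token = ".".join([
    b64url({"alg": "RS256", "kid": "example-key"}),
    b64url({"sub": "user-123", "iat": now, "exp": now + 3600}),
    "signature-placeholder",
])

def is_expired(jwt: str) -> bool:
    payload_b64 = jwt.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)   # restore stripped padding
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    return claims["exp"] <= time.time()

print(is_expired(token))
```

Once `exp` passes, the authorizer rejects the token automatically, which is why no per-user token table has to be maintained.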
…….