Exam Associate Cloud Engineer
Exam Number: Associate Cloud Engineer | Length of test: 2 hours |
Exam Name: Associate Cloud Engineer | Number of questions in the actual exam: 50+ |
Format: PDF, VPLUS | Passing Score: 70%+ |
Total Questions: 315
FREE
Premium VPLUS file
Download practice test questions
Title | Size | Hits | Download |
---|---|---|---|
Google.Associate Cloud Engineer.vJun-2024.by.Reen162q | 4.88 MB | 82 | Download |
Google.Associate Cloud Engineer.vJan-2024.by.Evir.97q | 3.12 MB | 78 | Download |
Google.Associate Cloud Engineer.vNov-2023.by.Misa.77q | 1.72 MB | 77 | Download |
Google.Associate Cloud Engineer.vNov-2023.by.Misa.77q | 5.90 MB | 77 | Download |
Google.Associate Cloud Engineer.vJan-2024.by.Evir.97q | 593.31 KB | 84 | Download |
Some new sample questions:
Q292
Your digital media company stores a large number of video files on-premises. Each video file ranges from 100 MB to 100 GB. You are currently storing 150 TB of video data in your on-premises network, with no room for expansion. You need to migrate all infrequently accessed video files older than one year to Cloud Storage to ensure that on-premises storage remains available for new files. You must also minimize costs and control bandwidth usage. What should you do?
A. Create a Cloud Storage bucket. Establish an Identity and Access Management (IAM) role with write permissions to the bucket. Use the gsutil tool to directly copy files over the network to Cloud Storage.
B. Set up a Cloud Interconnect connection between the on-premises network and Google Cloud. Establish a private endpoint for Filestore access. Transfer the data from the existing Network File System (NFS) to Filestore.
C. Use Transfer Appliance to request an appliance. Load the data locally, and ship the appliance back to Google for ingestion into Cloud Storage.
D. Use Storage Transfer Service to move the data from the selected on-premises file storage systems to a Cloud Storage bucket.
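For context on option D: Storage Transfer Service can pull from an on-premises POSIX file system through transfer agents, with the job described declaratively. A minimal sketch of such a job spec, where the project ID, agent pool, directory, and bucket names are hypothetical:

```json
{
  "description": "Archive video files older than one year",
  "status": "ENABLED",
  "projectId": "my-project",
  "transferSpec": {
    "sourceAgentPoolName": "projects/my-project/agentPools/onprem-pool",
    "posixDataSource": { "rootDirectory": "/mnt/video-archive" },
    "gcsDataSink": { "bucketName": "my-video-archive" },
    "objectConditions": {
      "minTimeElapsedSinceLastModification": "31536000s"
    }
  }
}
```

Bandwidth usage can be capped on the agent pool itself (its `bandwidthLimit` setting), which is what addresses the "control bandwidth usage" requirement in the scenario.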
Q293
You are developing an internet of things (IoT) application that captures sensor data from multiple devices that have already been set up. You need to identify the global data storage product your company should use to store this data. You must ensure that the storage solution you choose meets your requirements of sub-millisecond latency. What should you do?
A. Store the IoT data in Spanner. Use caches to speed up the process and avoid latencies.
B. Store the IoT data in Bigtable.
C. Capture IoT data in BigQuery datasets.
D. Store the IoT data in Cloud Storage. Implement caching by using Cloud CDN.

Q295
You need to migrate multiple PostgreSQL databases from your on-premises data center to Google Cloud. You want to significantly improve the performance of your databases while minimizing changes to your data schema and application code. You expect to exceed 150 TB of data per geographical region. You want to follow Google-recommended practices and minimize your operational costs. What should you do?
A. Migrate your data to AlloyDB.
B. Migrate your data to Spanner.
C. Migrate your data to Firebase.
D. Migrate your data to Bigtable.
……
Some new questions:
Q
Your web application is hosted on Cloud Run and needs to query a Cloud SQL database. Every morning during a traffic spike, you notice API quota errors in Cloud SQL logs. The project has already reached the maximum API quota. You want to make a configuration change to mitigate the issue. What should you do?
A. Modify the minimum number of Cloud Run instances.
B. Set a minimum concurrent requests environment variable for the application.
C. Modify the maximum number of Cloud Run instances.
D. Use traffic splitting.
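For reference on option C: the Cloud Run instance ceiling is set on the revision template, via the `autoscaling.knative.dev/maxScale` annotation in the service YAML. A sketch, assuming a hypothetical service and image name:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: web-app   # hypothetical service name
spec:
  template:
    metadata:
      annotations:
        # Cap the number of instances so the total volume of
        # Cloud SQL connections/API calls stays under quota.
        autoscaling.knative.dev/maxScale: "20"
    spec:
      containers:
        - image: gcr.io/my-project/web-app
```

The same limit can be applied with `gcloud run services update web-app --max-instances=20`.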
Q
Your company is running a three-tier web application on virtual machines that use a MySQL database. You need to create an estimated total cost of cloud infrastructure to run this application on Google Cloud instances and Cloud SQL. What should you do?
A. Use the Google Cloud Pricing Calculator to determine the cost of every Google Cloud resource you expect to use. Use similar size instances for the web server, and use your current on-premises machines as a comparison for Cloud SQL.
B. Implement a similar architecture on Google Cloud, and run a reasonable load test on a smaller scale. Check the billing information, and calculate the estimated costs based on the real load your system usually handles.
C. Use the Google Cloud Pricing Calculator and select the Cloud Operations template to define your web application with as much detail as possible.
D. Create a Google spreadsheet with multiple Google Cloud resource combinations. On a separate sheet, import the current Google Cloud prices and use these prices for the calculations within formulas.
Q
You need to deploy a single stateless web application with a web interface and multiple endpoints. For security reasons, the web application must be reachable from an internal IP address from your company’s private VPC and on-premises network. You also need to update the web application multiple times per day with minimal effort and want to manage a minimal amount of cloud infrastructure. What should you do?
A. Deploy the web application on Google Kubernetes Engine standard edition with an internal ingress.
B. Deploy the web application on Cloud Run with Private Google Access configured.
C. Deploy the web application to GKE Autopilot with Private Google Access configured.
D. Deploy the web application on Cloud Run with Private Service Connect configured.
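Related to options B and D: restricting a Cloud Run service to internal traffic is done with the `run.googleapis.com/ingress` annotation on the service. A sketch with hypothetical names; note that reaching the service from on-premises additionally requires private connectivity such as a Private Service Connect endpoint or an internal load balancer:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: internal-web-app   # hypothetical service name
  annotations:
    # Accept traffic only from the VPC and connected networks.
    run.googleapis.com/ingress: internal
spec:
  template:
    spec:
      containers:
        - image: gcr.io/my-project/web-app
```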
Q
Your application stores files on Cloud Storage by using the Standard Storage class. The application only requires access to files created in the last 30 days. You want to automatically save costs on files that are no longer accessed by the application. What should you do?
A. Create a retention policy on the storage bucket of 30 days, and lock the bucket by using a retention policy lock.
B. Enable object versioning on the storage bucket and add lifecycle rules to expire non-current versions after 30 days.
C. Create an object lifecycle on the storage bucket to change the storage class to Archive Storage for objects with an age over 30 days.
D. Create a cron job in Cloud Scheduler to call a Cloud Functions instance every day to delete files older than 30 days.
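For reference on option C: an Object Lifecycle Management rule that moves objects to Archive storage after 30 days is a small JSON document. A minimal sketch:

```json
{
  "rule": [
    {
      "action": { "type": "SetStorageClass", "storageClass": "ARCHIVE" },
      "condition": { "age": 30 }
    }
  ]
}
```

Saved as `lifecycle.json` (hypothetical filename), this can be applied with `gsutil lifecycle set lifecycle.json gs://BUCKET` or `gcloud storage buckets update gs://BUCKET --lifecycle-file=lifecycle.json`.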
…………..