Google Associate Cloud Engineer Exam Page 2 (Dumps)
Question No:-11
|
You need a dynamic way of provisioning VMs on Compute Engine. The exact specifications will be in a dedicated configuration file. You want to follow Google's recommended practices. Which method should you use?
1. Deployment Manager
2. Cloud Composer
3. Managed Instance Group
4. Unmanaged Instance Group
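If Deployment Manager (option 1) is the intended answer here, a minimal sketch of a config-driven VM deployment could look like the following; the deployment name, zone, machine type, and image are placeholders.

cat > vm.yaml <<'EOF'
resources:
- name: my-vm
  type: compute.v1.instance
  properties:
    zone: us-central1-a
    machineType: zones/us-central1-a/machineTypes/e2-medium
    disks:
    - deviceName: boot
      type: PERSISTENT
      boot: true
      autoDelete: true
      initializeParams:
        sourceImage: projects/debian-cloud/global/images/family/debian-11
    networkInterfaces:
    - network: global/networks/default
EOF

# Provision the VM exactly as specified in the configuration file
gcloud deployment-manager deployments create my-vm-deployment --config vm.yaml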
|
Question No:-12
|
You have a Dockerfile that you need to deploy on Kubernetes Engine. What should you do?
1. Use kubectl app deploy <dockerfilename>.
2. Use gcloud app deploy <dockerfilename>.
3. Create a docker image from the Dockerfile and upload it to Container Registry. Create a Deployment YAML file to point to that image. Use kubectl to create the deployment with that file.
4. Create a docker image from the Dockerfile and upload it to Cloud Storage. Create a Deployment YAML file to point to that image. Use kubectl to create the deployment with that file.
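If option 3 is the intended answer here, the workflow could be sketched as follows; PROJECT_ID, the image name, the tag, and the replica count are placeholders.

# Build the image from the Dockerfile and push it to Container Registry
docker build -t gcr.io/PROJECT_ID/my-app:v1 .
docker push gcr.io/PROJECT_ID/my-app:v1

# Deployment manifest that points at the pushed image
cat > deployment.yaml <<'EOF'
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
      - name: my-app
        image: gcr.io/PROJECT_ID/my-app:v1
EOF

# Create the deployment on the GKE cluster with that file
kubectl create -f deployment.yaml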
|
Question No:-13
|
Your development team needs a new Jenkins server for their project. You need to deploy the server using the fewest steps possible. What should you do?
1. Download and deploy the Jenkins Java WAR to App Engine Standard.
2. Create a new Compute Engine instance and install Jenkins through the command line interface.
3. Create a Kubernetes cluster on Compute Engine and create a deployment with the Jenkins Docker image.
4. Use GCP Marketplace to launch the Jenkins solution.
|
Question No:-14
|
You need to update a deployment in Deployment Manager without any resource downtime in the deployment. Which command should you use?
1. gcloud deployment-manager deployments create --config <deployment-config-path>
2. gcloud deployment-manager deployments update --config <deployment-config-path>
3. gcloud deployment-manager resources create --config <deployment-config-path>
4. gcloud deployment-manager resources update --config <deployment-config-path>
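If option 2 is the intended answer here, usage could look like the following; the deployment name and config path are placeholders, and --preview is an optional extra step to stage the changes before committing them.

# Update the existing deployment in place from the new configuration
gcloud deployment-manager deployments update my-deployment --config updated-config.yaml

# Or stage the changes first, review them, then commit
gcloud deployment-manager deployments update my-deployment --config updated-config.yaml --preview
gcloud deployment-manager deployments update my-deployment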
|
Question No:-15
|
You need to run an important query in BigQuery but expect it to return a lot of records. You want to find out how much it will cost to run the query. You are using on-demand pricing. What should you do?
1. Arrange to switch to Flat-Rate pricing for this query, then move back to on-demand.
2. Use the command line to run a dry run query to estimate the number of bytes read. Then convert that bytes estimate to dollars using the Pricing Calculator.
3. Use the command line to run a dry run query to estimate the number of bytes returned. Then convert that bytes estimate to dollars using the Pricing Calculator.
4. Run a SELECT COUNT(*) to get an idea of how many records your query will look through. Then convert that number of rows to dollars using the Pricing Calculator.
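If option 2 is the intended answer here, a dry run from the command line could look like this; the project, dataset, and table names are placeholders. The command validates the query and reports how many bytes it would process without actually running it.

bq query --use_legacy_sql=false --dry_run \
  'SELECT * FROM `my-project.my_dataset.my_table` WHERE status = "active"'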
|
Question No:-16
|
You have a single binary application that you want to run on Google Cloud Platform. You decided to automatically scale the application based on underlying infrastructure CPU usage. Your organizational policies require you to use virtual machines directly. You need to ensure that the application scaling is operationally efficient and completed as quickly as possible. What should you do?
1. Create a Google Kubernetes Engine cluster, and use horizontal pod autoscaling to scale the application.
2. Create an instance template, and use the template in a managed instance group with autoscaling configured.
3. Create an instance template, and use the template in a managed instance group that scales up and down based on the time of day.
4. Use a set of third-party tools to build automation around scaling the application up and down, based on Stackdriver CPU usage monitoring.
Answer:-2. Create an instance template, and use the template in a managed instance group with autoscaling configured.
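A rough gcloud sketch of the answer above; the names, zone, image, and autoscaling thresholds are placeholders.

# Instance template defines the VM shape used by the group
gcloud compute instance-templates create my-template \
  --machine-type=e2-medium \
  --image-family=debian-11 --image-project=debian-cloud

# Managed instance group built from the template
gcloud compute instance-groups managed create my-mig \
  --zone=us-central1-a --template=my-template --size=1

# Autoscale on CPU utilization of the underlying VMs
gcloud compute instance-groups managed set-autoscaling my-mig \
  --zone=us-central1-a \
  --min-num-replicas=1 --max-num-replicas=10 \
  --target-cpu-utilization=0.6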
|
Question No:-17
|
You are analyzing Google Cloud Platform service costs from three separate projects. You want to use this information to create service cost estimates by service type, daily and monthly, for the next six months using standard query syntax. What should you do?
1. Export your bill to a Cloud Storage bucket, and then import into Cloud Bigtable for analysis.
2. Export your bill to a Cloud Storage bucket, and then import into Google Sheets for analysis.
3. Export your transactions to a local file, and perform analysis with a desktop tool.
4. Export your bill to a BigQuery dataset, and then write time window-based SQL queries for analysis.
Answer:-4. Export your bill to a BigQuery dataset, and then write time window-based SQL queries for analysis.
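With the billing export to BigQuery enabled, a query along these lines breaks cost down by service and day; the project, dataset, and export table names are placeholders following the export's naming pattern.

bq query --use_legacy_sql=false '
SELECT
  service.description AS service,
  DATE(usage_start_time) AS usage_day,
  SUM(cost) AS daily_cost
FROM `my-project.billing_export.gcp_billing_export_v1_XXXXXX`
GROUP BY service, usage_day
ORDER BY usage_day, daily_cost DESC'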
|
Question No:-18
|
You need to set up a policy so that videos stored in a specific Cloud Storage Regional bucket are moved to Coldline after 90 days, and then deleted after one year from their creation. How should you set up the policy?
1. Use Cloud Storage Object Lifecycle Management using Age conditions with SetStorageClass and Delete actions. Set the SetStorageClass action to 90 days and the Delete action to 275 days (365 - 90).
2. Use Cloud Storage Object Lifecycle Management using Age conditions with SetStorageClass and Delete actions. Set the SetStorageClass action to 90 days and the Delete action to 365 days.
3. Use gsutil rewrite and set the Delete action to 275 days (365-90).
4. Use gsutil rewrite and set the Delete action to 365 days.
Answer:-2. Use Cloud Storage Object Lifecycle Management using Age conditions with SetStorageClass and Delete actions. Set the SetStorageClass action to 90 days and the Delete action to 365 days.
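A sketch of the corresponding lifecycle configuration, applied with gsutil; the bucket name is a placeholder. Note that the Age condition is measured from each object's creation time, and a lifecycle SetStorageClass action does not reset it, which is why the Delete rule uses 365 days.

cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 90}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365}
    }
  ]
}
EOF

gsutil lifecycle set lifecycle.json gs://my-video-bucket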
|
Question No:-19
|
You have a Linux VM that must connect to Cloud SQL. You created a service account with the appropriate access rights. You want to make sure that the VM uses this service account instead of the default Compute Engine service account. What should you do?
1. When creating the VM via the web console, specify the service account under the 'Identity and API Access' section.
2. Download a JSON Private Key for the service account. On the Project Metadata, add that JSON as the value for the key compute-engine-service-account.
3. Download a JSON Private Key for the service account. On the Custom Metadata of the VM, add that JSON as the value for the key compute-engine-service-account.
4. Download a JSON Private Key for the service account. After creating the VM, ssh into the VM and save the JSON under ~/.gcloud/compute-engine-service-account.json.
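If option 1 is the intended answer here, the command-line equivalent at VM-creation time could look like this; the VM name, service account email, and scope are placeholders.

gcloud compute instances create my-vm \
  --service-account=cloudsql-client@my-project.iam.gserviceaccount.com \
  --scopes=https://www.googleapis.com/auth/sqlservice.admin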
|
Question No:-20
|
You are deploying an application to App Engine. You want the number of instances to scale based on request rate. You need at least 3 unoccupied instances at all times. Which scaling type should you use?
1. Manual Scaling with 3 instances.
2. Basic Scaling with min_instances set to 3.
3. Basic Scaling with max_instances set to 3.
4. Automatic Scaling with min_idle_instances set to 3.
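If option 4 is the intended answer here, an app.yaml along these lines keeps at least 3 idle instances warm while App Engine scales automatically on request rate; the runtime is a placeholder.

cat > app.yaml <<'EOF'
runtime: python39
automatic_scaling:
  min_idle_instances: 3
EOF

gcloud app deploy app.yaml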
|