1. A company is using a VPC peering strategy to connect its VPCs in a single Region to allow for cross-communication. A recent increase in account creations and VPCs has made it difficult to maintain the VPC peering strategy, and the company expects to grow to hundreds of VPCs. There are also new requests to create site-to-site VPNs with some of the VPCs. A solutions architect has been tasked with creating a centralized networking setup for multiple accounts, VPCs, and VPNs. Which networking solution meets these requirements?
A) Configure a hub-and-spoke VPC and route all traffic through VPC peering. B) Configure AWS Direct Connect connections between all VPCs and VPNs. C) Configure a transit gateway with AWS Transit Gateway and connect all VPCs and VPNs. D) Configure shared VPCs and VPNs and share them with each other.
2. A company recently released a new type of internet-connected sensor. The company is expecting to sell thousands of sensors, which are designed to stream high volumes of data each second to a central location. A solutions architect must design a solution that ingests and stores data so that engineering teams can analyze it in near-real time with millisecond responsiveness. Which solution should the solutions architect recommend?
A) Use an Amazon SQS queue to ingest the data. Consume the data with an AWS Lambda function, which then stores the data in Amazon Redshift. B) Use an Amazon SQS queue to ingest the data. Consume the data with an AWS Lambda function, which then stores the data in Amazon DynamoDB. C) Use Amazon Kinesis Data Streams to ingest the data. Consume the data with an AWS Lambda function, which then stores the data in Amazon Redshift. D) Use Amazon Kinesis Data Streams to ingest the data. Consume the data with an AWS Lambda function, which then stores the data in Amazon DynamoDB.
3. An application running on AWS uses an Amazon Aurora Multi-AZ deployment for its database. When evaluating performance metrics, a solutions architect discovered that the database reads are causing high I/O and adding latency to the write requests against the database. What should the solutions architect do to separate the read requests from the write requests?
A) Create a second Amazon Aurora database and link it to the primary database as a read replica. B) Update the application to read from the Multi-AZ standby instance. C) Enable read-through caching on the Amazon Aurora database. D) Create a read replica and modify the application to use the appropriate endpoint.
4. A solutions architect is tasked with transferring 750 TB of data from a network-attached file system located at a branch office to Amazon S3 Glacier. The solution must avoid saturating the branch office's low-bandwidth internet connection. What is the MOST cost-effective solution?
A) Order 10 AWS Snowball appliances and select an Amazon S3 bucket as the destination. Create a lifecycle policy to transition the S3 objects to Amazon S3 Glacier. B) Mount the network-attached file system to Amazon S3 and copy the files directly. Create a lifecycle policy to transition the S3 objects to Amazon S3 Glacier. C) Order 10 AWS Snowball appliances and select an S3 Glacier vault as the destination. Create a bucket policy to enforce a VPC endpoint. D) Create a site-to-site VPN tunnel to an Amazon S3 bucket and transfer the files directly. Create an S3 bucket VPC endpoint.
5. A company's operations team has an existing Amazon S3 bucket configured to notify an Amazon SQS queue when new objects are created within the bucket. The development team also wants to receive events when new objects are created. The existing operations team workflow must remain intact. Which solution would satisfy these requirements?
A) Create a new SQS queue that only allows Amazon S3 to access the queue. Update Amazon S3 to update this queue when a new object is created. B) Create another SQS queue. Update the S3 events in the bucket to also update the new queue when a new object is created. C) Create an Amazon SNS topic and SQS queue for the bucket updates. Update the bucket to send events to the new topic. Add subscriptions for both queues to the topic. D) Create an Amazon SNS topic and SQS queue for the bucket updates. Update the bucket to send events to the new topic. Update the queues to poll Amazon SNS.
1. Right Answer: C Explanation:
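To illustrate option C, here is a minimal Python (boto3) sketch of how a central transit gateway hub might be built; the Region and all resource IDs are placeholders, not values from the question, and cross-account VPC attachments would additionally require sharing the transit gateway via AWS RAM.

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed Region

# Create the transit gateway that acts as the central hub.
tgw = ec2.create_transit_gateway(Description="central-networking-hub")
tgw_id = tgw["TransitGateway"]["TransitGatewayId"]

# Attach each VPC to the transit gateway (repeated per VPC).
ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId=tgw_id,
    VpcId="vpc-0123456789abcdef0",           # placeholder VPC ID
    SubnetIds=["subnet-0123456789abcdef0"],  # one subnet per Availability Zone
)

# Terminate site-to-site VPNs on the same transit gateway instead of per-VPC gateways.
ec2.create_vpn_connection(
    CustomerGatewayId="cgw-0123456789abcdef0",  # placeholder customer gateway ID
    Type="ipsec.1",
    TransitGatewayId=tgw_id,
)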
2. Right Answer: D Explanation: Analyze data in Amazon DynamoDB using Amazon SageMaker for real-time prediction | AWS Big Data Blog
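As a rough illustration of option D, the sketch below shows a Lambda handler that consumes a batch of Kinesis records and writes them to DynamoDB; the table name, item attributes, and payload fields are assumptions for illustration only.

import base64
import json
from decimal import Decimal
import boto3

table = boto3.resource("dynamodb").Table("SensorReadings")  # assumed table name

def handler(event, context):
    # Kinesis delivers a batch of base64-encoded records per invocation.
    with table.batch_writer() as batch:
        for record in event["Records"]:
            payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
            batch.put_item(Item={
                "sensor_id": payload["sensor_id"],            # partition key (assumed schema)
                "ts": payload["timestamp"],                   # sort key, ISO-8601 string (assumed)
                "reading": Decimal(str(payload["reading"])),  # DynamoDB numbers must be Decimal, not float
            })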
3. Right Answer: D Explanation: Replication with Amazon Aurora - Amazon Aurora
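To illustrate option D, here is a minimal boto3 sketch of adding an Aurora Replica to the existing cluster and retrieving the cluster's reader and writer endpoints so the application can split traffic; the Region, instance class, and identifiers are placeholders.

import boto3

rds = boto3.client("rds", region_name="us-east-1")  # assumed Region

# Adding a second instance to an Aurora cluster creates an Aurora Replica (reader).
rds.create_db_instance(
    DBInstanceIdentifier="app-db-reader-1",  # placeholder instance identifier
    DBInstanceClass="db.r6g.large",          # placeholder instance class
    Engine="aurora-mysql",
    DBClusterIdentifier="app-db-cluster",    # placeholder cluster identifier
)

# The application then directs traffic by endpoint.
cluster = rds.describe_db_clusters(DBClusterIdentifier="app-db-cluster")["DBClusters"][0]
writer_endpoint = cluster["Endpoint"]        # use for writes
reader_endpoint = cluster["ReaderEndpoint"]  # use for read-only queries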