Open Guide for AWS Certified Data Analytics - Specialty (DAS-C01) Exam
The AWS Certified Data Analytics - Specialty (DAS-C01) examination is intended for individuals who perform in a data analytics-focused role. This exam validates an examinee’s comprehensive understanding of using AWS services to design, build, secure, and maintain analytics solutions that provide insight from data.
It validates an examinee’s ability to:
- Define AWS data analytics services and understand how they integrate with each other.
- Explain how AWS data analytics services fit in the data lifecycle of collection, storage, processing, and visualization.

Recommended knowledge and experience:

- A minimum of 5 years of experience with common data analytics technologies
- At least 2 years of hands-on experience working on AWS
- Experience and expertise working with AWS services to design, build, secure, and maintain analytics solutions
There are two types of questions on the examination:
- Multiple choice: Has one correct response and three incorrect responses (distractors).
- Multiple response: Has two or more correct responses out of five or more options.
| Domain | % of Questions |
|---|---|
| Domain 1: Collection | 18% |
| Domain 2: Storage and Data Management | 22% |
| Domain 3: Processing | 24% |
| Domain 4: Analysis and Visualization | 18% |
| Domain 5: Security | 18% |
| Total | 100% |
**Domain 1: Collection**

- Determine the operational characteristics of the collection system
- Select a collection system that handles the frequency, volume, and source of data (see the sketch after this list)
- Select a collection system that addresses the key properties of data, such as order, format, and compression
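
To make the collection objectives concrete, here is a minimal sketch (not part of the official exam guide) of a streaming ingestion path using Kinesis Data Streams via boto3. The stream name, region, and event shape are hypothetical placeholders.

```python
# Minimal Kinesis Data Streams producer sketch (hypothetical stream/event).
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

def put_event(event: dict, stream_name: str = "example-clickstream") -> None:
    """Send one JSON event to the stream."""
    # The partition key controls shard assignment; a high-cardinality key
    # such as a user id spreads records evenly across shards.
    kinesis.put_record(
        StreamName=stream_name,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event["user_id"]),
    )

put_event({"user_id": 42, "action": "page_view", "page": "/home"})
```

For higher volumes, batched `put_records` calls or the Kinesis Producer Library handle aggregation and retries; note that ordering is preserved only within a shard.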
**Domain 2: Storage and Data Management**

- Determine the operational characteristics of a storage solution for analytics
- Determine data access and retrieval patterns
- Select an appropriate data layout, schema, structure, and format
- Define a data lifecycle based on usage patterns and business requirements
- Determine an appropriate system for cataloging data and managing metadata (see the Glue Data Catalog sketch after this list)
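
As an illustration of cataloging and metadata management, the sketch below drives the AWS Glue Data Catalog with boto3. The crawler, database, and table names are hypothetical and assumed to already exist.

```python
# Glue Data Catalog sketch: refresh and inspect table metadata
# (crawler/database/table names are hypothetical).
import boto3

glue = boto3.client("glue", region_name="us-east-1")

# Kick off a crawl so new files and partitions in S3 get registered.
# Crawls are asynchronous; poll get_crawler() before relying on the results.
glue.start_crawler(Name="example-raw-data-crawler")

# Table metadata (schema, format, S3 location) is available from the
# catalog without scanning the underlying data.
table = glue.get_table(DatabaseName="example_db", Name="raw_events")
for col in table["Table"]["StorageDescriptor"]["Columns"]:
    print(col["Name"], col["Type"])
```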
**Domain 3: Processing**

- Determine appropriate data processing solution requirements
- Design a solution for transforming and preparing data for analysis (a PySpark sketch follows this list)
- Automate and operationalize a data processing solution
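
For the transform-and-prepare objective, a common pattern is converting raw CSV to partitioned Parquet with Spark on EMR or Glue. This PySpark sketch uses hypothetical bucket names and columns, and assumes S3 access is configured (as it is by default on EMR and Glue).

```python
# PySpark sketch: read raw CSV from S3, clean it, write partitioned Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("prepare-events").getOrCreate()

raw = spark.read.option("header", True).csv("s3://example-raw/events/")

prepared = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("dt", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Columnar, compressed, partitioned output keeps downstream scans cheap.
prepared.write.mode("overwrite").partitionBy("dt").parquet(
    "s3://example-curated/events/"
)
```

Operationalizing a job like this usually means scheduling it (e.g., Glue triggers, Step Functions, or Amazon MWAA) rather than running it by hand.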
**Domain 4: Analysis and Visualization**

- Determine the operational characteristics of an analysis and visualization solution
- Select the appropriate data analysis solution for a given scenario (see the Athena sketch after this list)
- Select the appropriate data visualization solution for a given scenario
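
As a sketch of serverless ad hoc analysis, the snippet below runs an Athena query over the table cataloged in the earlier examples. The database, table, and results bucket are hypothetical.

```python
# Athena sketch: run a query and print the results
# (database/table/output bucket are hypothetical).
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

qid = athena.start_query_execution(
    QueryString="SELECT action, COUNT(*) AS n FROM raw_events GROUP BY action",
    QueryExecutionContext={"Database": "example_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    status = athena.get_query_execution(QueryExecutionId=qid)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    for row in rows:
        print([f.get("VarCharValue") for f in row["Data"]])
```

For visualization, the same catalog table can back a QuickSight dataset via the Athena connector.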
**Domain 5: Security**

- Select appropriate authentication and authorization mechanisms
- Apply data protection and encryption techniques (see the SSE-KMS sketch after this list)
- Apply data governance and compliance controls
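
To illustrate encryption at rest, this sketch enables default SSE-KMS on a data-lake bucket via boto3. The bucket name and key alias are hypothetical.

```python
# S3 default-encryption sketch (bucket name and key alias are hypothetical).
import boto3

s3 = boto3.client("s3")

s3.put_bucket_encryption(
    Bucket="example-curated",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "alias/example-datalake-key",
                },
                # S3 Bucket Keys reduce KMS request costs for buckets with
                # many objects.
                "BucketKeyEnabled": True,
            }
        ]
    },
)
```

Denying unencrypted uploads and enforcing TLS are typically handled in the bucket policy, while the KMS key policy governs who may use the key.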
- Big Data Analytics Architecture Patterns & Best Practices
  - Videos
  - Presentation Slides
  - Topics
    - Types of Big Data Analytics
    - Delivery Models
    - Architectural Principles
    - Big Data Pipeline
    - ELT/ETL on AWS
- Whitepapers
- 📒 Marks standard/official AWS pages and docs
- 🔹 Important or often overlooked tip
- ❗ “Serious” gotcha (used where risks or time or resource costs are significant: critical security risks, mistakes with significant financial cost, or poor architectural choices that are fundamentally difficult to correct)
- 🔸 “Regular” gotcha, limitation, or quirk (used where consequences are things not working, breaking, or not scaling gracefully)
- 📜 Undocumented feature (folklore)
- 🐥 Relatively new (and perhaps immature) services or features
- ⏱ Performance discussions
- ⛓ Lock-in: Products or decisions that are likely to tie you to AWS in a new or significant way — that is, later moving to a non-AWS alternative would be costly in terms of engineering effort
- 🚪 Alternative non-AWS options
- 💸 Cost issues, discussion, and gotchas
- 🕍 A mild warning attached to “full solution” or opinionated frameworks that may take significant time to understand and/or might not fit your needs exactly; the opposite of a point solution (the cathedral is a nod to Raymond’s metaphor)
- 📗📘📙 Colors indicate basics, tips, and gotchas, respectively.
- 🚧 Areas where correction or improvement are needed (possibly with link to an issue — do help!)