
AWS Developer Associate Score Report Explained — How Close Were You Really?

How do I read my AWS DVA-C02 score report?

Your DVA-C02 score report uses scaled scoring from 100–1000 with a passing threshold of 720. A score between 680–719 does not mean you were ‘one question away’—scaled scoring makes the gap larger than it appears. Focus on domains with the lowest performance bars for your retake preparation.

You’re staring at your score report wondering what the numbers actually mean. Here’s the uncomfortable truth: if you scored between 680 and 719, you weren’t “one question away” from passing. The gap is real, even if it feels small. Let me explain what your report is actually telling you.

How AWS Actually Scores This Exam

AWS uses something called scaled scoring. Your score isn’t a simple percentage of correct answers — it’s a number between 100 and 1000 that’s been adjusted to account for different question difficulties across different exam versions. The goal is fairness: someone who got a harder question pool shouldn’t be penalized compared to someone with an easier one.

Why Your Score Isn’t a Percentage

Here’s what the scaling means practically:

  • Two people answering the same number of questions correctly might get different scores if their question pools differed in difficulty
  • You can’t calculate your raw score from your scaled score without internal AWS data
  • A 690 on one exam version isn’t directly comparable to a 690 on another

The passing threshold is 720 out of 1000. AWS doesn’t publish what raw percentage this typically represents, but based on exam structure, it’s usually around 70–75% correct.
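To make the raw-vs-scaled distinction concrete, here's a toy model in Python. The linear mapping and the 72% raw cut are assumptions for illustration only; AWS's actual transformation is proprietary, nonlinear, and varies by exam version:

```python
# Toy model only: AWS's real scaled-scoring function is proprietary.
# Assume a hypothetical exam version where the passing cut sits at 72% raw,
# and raw performance maps linearly onto the 100-1000 scale with 720 at the cut.

def scaled_score(raw_pct, cut_raw_pct=0.72):
    """Map a raw fraction correct to a 100-1000 scaled score (toy model)."""
    if raw_pct <= cut_raw_pct:
        # Below the cut: interpolate between 100 (0% raw) and 720 (the cut).
        return 100 + (raw_pct / cut_raw_pct) * (720 - 100)
    # At or above the cut: interpolate between 720 and 1000 (100% raw).
    return 720 + ((raw_pct - cut_raw_pct) / (1 - cut_raw_pct)) * (1000 - 720)

for pct in (0.65, 0.70, 0.72, 0.80):
    print(f"{pct:.0%} raw -> {scaled_score(pct):.0f} scaled")
```

Even in this simplified model, notice that equal raw gaps don't translate to equal scaled gaps above and below the cut, which is one reason you can't back out your raw score from the report.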

Why AWS Never Shows Specific Questions

You won’t see which questions you got wrong, how many you answered correctly, or any difficulty ratings. This is intentional — if AWS showed that information, it would spread online and compromise the exam.

The trade-off is that you get less actionable data than you’d probably want. You’re working with domain-level feedback, not question-level analysis.

Why Some Exam Versions Feel Harder

AWS rotates multiple exam versions. Some lean heavily toward Lambda and serverless; others emphasize DynamoDB or CI/CD. If your version happened to focus on your weaker areas, it felt harder — and your performance probably suffered.

The scaled scoring partially compensates for this, but not perfectly. The subjective experience of “that exam was brutal” is often real.

Understanding Those Domain Bars

Your score report has horizontal bars showing performance in each domain. They use labels like “Needs Improvement,” “Near Target,” or “Meets Target.” Here’s what they actually mean.

What the Labels Really Tell You

The bars don’t show percentages. They show relative performance against an internal benchmark AWS doesn’t share. “Near target” means you answered some questions correctly in that domain, but not enough to meet competency expectations.

DVA-C02 has four domains with these weightings:

  • Domain 1: Development with AWS Services (32%)
  • Domain 2: Security (26%)
  • Domain 3: Deployment (24%)
  • Domain 4: Troubleshooting and Optimization (18%)

A “below target” in Domain 1 hurts more than in Domain 4 because Domain 1 carries more weight.
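The weighting math is easy to check yourself. The sketch below uses the published domain weights with hypothetical per-domain performance numbers (the 55% and 85% figures are made up for illustration) to show why the same weakness costs more in Domain 1 than in Domain 4:

```python
# Published DVA-C02 domain weights; per-domain performance is hypothetical.
weights = {
    "Development with AWS Services": 0.32,
    "Security": 0.26,
    "Deployment": 0.24,
    "Troubleshooting and Optimization": 0.18,
}

def weighted_raw(perf_by_domain):
    """Overall raw fraction correct, weighted by each domain's share of the exam."""
    return sum(weights[d] * p for d, p in perf_by_domain.items())

# Same weakness (55% correct), placed in the heaviest vs. lightest domain,
# with 85% correct everywhere else:
weak_domain1 = {"Development with AWS Services": 0.55, "Security": 0.85,
                "Deployment": 0.85, "Troubleshooting and Optimization": 0.85}
weak_domain4 = {"Development with AWS Services": 0.85, "Security": 0.85,
                "Deployment": 0.85, "Troubleshooting and Optimization": 0.55}

print(f"Weak in Domain 1: {weighted_raw(weak_domain1):.1%} overall")
print(f"Weak in Domain 4: {weighted_raw(weak_domain4):.1%} overall")
```

Identical domain-level weakness, roughly a four-point swing in overall raw performance, purely because of where the weakness lands.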

The “Near Target Everywhere” Trap

Here’s something a lot of people misunderstand: if every domain shows “near target,” that’s not good news. It means you were consistently below the competency threshold across the entire exam. You didn’t have compensating strengths in some areas that offset weaknesses in others.

Multiple “near target” results usually mean a failing score, even if none of them look catastrophic on their own.

Common Misinterpretations

“I was strong in three domains, so I should have passed.”
Not necessarily. If your one weak domain was heavily weighted and you scored well below target, it can drag your overall score below 720.

“My weakest domain was only ‘near target,’ so I was close.”
“Near target” is still below the competency threshold. Multiple near-target domains compound into a real gap.

“The bars look almost equal, so I almost passed.”
The visual representation isn’t proportional to your actual gap from passing. Small differences in bar length can mean significant score differences. If you want to go deeper on why candidates misread their performance, it helps to understand the structural causes behind failing scores.

Why Practice Scores Don’t Match Reality

Lots of candidates score 80%+ on practice exams and then fail DVA-C02 with a 680. This gap is common and here’s why.

Practice Exams Are Often Easier

Practice exam question pools don’t fully match the real thing. They often ask more direct knowledge questions and fewer scenario-based decision questions. High practice scores can create false confidence.

The Question Style Is Different

DVA-C02 emphasizes scenarios that require reading carefully for constraints, eliminating options that violate those constraints, and choosing between multiple technically valid answers based on AWS best practices.

Practice exams often test if you know a fact. The real exam tests if you can apply that fact correctly. These are different cognitive skills.

The Overfitting Problem

If you took the same practice exam multiple times, you probably memorized answers rather than understanding reasoning. High practice scores but low real scores often indicate pattern recognition without true understanding.

If You Scored 680–719, How Close Were You?

A lot of candidates ask: “Was I one question away from passing?” The answer is almost always no.

What That Score Range Actually Means

A score of 680–719 suggests you demonstrated competency in some areas but fell meaningfully short in others. In raw terms, you probably missed passing by 3–8 questions, not 1–2.
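A rough back-of-the-envelope check, assuming about 50 scored questions (DVA-C02 presents 65 questions, 15 of which are unscored) and an optimistically linear mapping near the cut. Treat the result as a lower bound, since the real transformation is nonlinear and tends to widen the gap:

```python
# Rough estimate only: assumes ~50 scored questions and a locally linear
# mapping from raw correct answers to the 100-1000 scale.
SCORED_QUESTIONS = 50
SCALE_RANGE = 1000 - 100                  # 900 scaled points across the raw range
POINTS_PER_QUESTION = SCALE_RANGE // SCORED_QUESTIONS  # ~18 scaled points each

def questions_short(score, passing=720):
    """Approximate minimum number of additional correct answers needed to pass."""
    gap = passing - score
    return max(0, -(-gap // POINTS_PER_QUESTION))  # ceiling division

print(f"Score 680: at least {questions_short(680)} more correct answers needed")
```

Even under these generous assumptions, a 680 is about three questions short, not one; with the actual nonlinear scaling, the real number is often higher, which is where the 3–8 range comes from.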

At this level, you have foundational AWS development knowledge but gaps in either depth or decision-making ability.

Why “Almost Passed” Is Still a Real Gap

A near-pass isn’t bad luck. It reflects a real gap between your capability and exam requirements. That gap might feel small emotionally, but it represents material you didn’t master.

Treating a near-pass as “I just need to try again” without changing your prep usually produces the same result. The exam tests the same competencies. If you don’t close the gap, your score won’t meaningfully change.

How to Actually Use This Report

What It’s Good For

Your domain breakdown shows where to focus. If Domain 2 (Security) shows “needs improvement” while others are better, that’s clear guidance: you need to study IAM, Cognito, encryption, and secrets management more thoroughly.

Use it to prioritize. Don’t study everything equally. Spend more time on weak domains. Once you’ve identified gaps, you can build a retake plan that addresses them systematically.

What It Can’t Tell You

The score report doesn’t tell you:

  • Which specific topics within a domain you missed
  • Whether you struggled with knowledge gaps or decision-making
  • How to study differently
  • What your actual raw score was

You’ll need to fill in gaps from your exam experience. Which questions felt uncertain? Which scenarios confused you? Combine the domain report with your memory of the exam for a more complete picture.


Your score report is a diagnostic tool, not a verdict. It tells you where you fell short without revealing exactly why. Use it to guide your retake prep, but don’t over-interpret the visual bars or assume a near-pass means minimal effort is needed. The gap is real, and closing it takes targeted, honest work.