Test Coverage Analysis Prompt for Ruby on Rails Project

Back to main README

Use the cov-loupe MCP server tool to analyze the test coverage data generated by simplecov for this Ruby on Rails project. Generate a detailed coverage analysis report and output it to a markdown file named test_coverage_analysis.md.
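
This assumes SimpleCov is already wired into the test suite so that a coverage/ directory exists for the tool to read. A minimal setup, assuming RSpec (the equivalent lines go in test/test_helper.rb for Minitest), looks roughly like this:

```ruby
# spec/spec_helper.rb -- SimpleCov must start before any application code is loaded
require 'simplecov'

SimpleCov.start 'rails' do
  enable_coverage :branch   # line + branch coverage (requires SimpleCov >= 0.18)
  add_filter '/bin/'
  add_filter '/db/'
  add_filter '/vendor/'
end
```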

The report should include:

1. Executive Summary

  • Overall coverage percentage and trend direction
  • High-level assessment of testing health across the Rails application
  • Key strengths and critical gaps at a glance

2. Coverage by Rails Component

Detailed analysis of coverage across different Rails layers (one way to group files into these layers with SimpleCov is sketched after this list):

  • Models: ActiveRecord models, validations, scopes, associations, callbacks
  • Controllers: Actions, before_actions, strong parameters, response handling
  • Views: View helpers, partials, component logic
  • Mailers: Email templates and delivery logic
  • Jobs: ActiveJob background jobs and queue handlers
  • Services/POROs: Business logic classes and service objects
  • Concerns: Shared modules and mixins
  • Lib files: Custom libraries and utilities
  • Configuration: Initializers and config files (if applicable)
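
The per-layer breakdown is easiest to produce when SimpleCov groups files along these same lines. Building on the setup sketch above, the grouping might look like this (group names and paths are assumptions; adjust them to the app's actual layout):

```ruby
SimpleCov.start 'rails' do
  add_group 'Models',       'app/models'
  add_group 'Controllers',  'app/controllers'
  add_group 'View helpers', 'app/helpers'   # .erb templates themselves are not tracked by line coverage
  add_group 'Mailers',      'app/mailers'
  add_group 'Jobs',         'app/jobs'
  add_group 'Services',     'app/services'
  add_group 'Concerns',     ['app/models/concerns', 'app/controllers/concerns']
  add_group 'Libraries',    'lib'
end
```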

3. Well-Tested Areas

  • What's working well and why
  • Testing patterns and practices to replicate
  • Examples of strong test coverage with specific file references

4. Poorly-Tested Areas

Organize by:

  • Complexity to Test: Simple/Moderate/Complex (considering Rails dependencies, database setup, external services, etc.)
  • Risk Level: High/Medium/Low (impact on user experience, data integrity, security)
  • Coverage Gap Size: Percentage of uncovered lines and number of untested methods

5. Priority Issues Table

Create a markdown table with these columns:

  • File/Module Path
  • Component Type (Model/Controller/Job/etc.)
  • Current Coverage %
  • Uncovered Lines Count
  • Risk Level (High/Medium/Low)
  • Complexity to Fix (High/Medium/Low)
  • Priority Score (1-10, with 10 being highest priority)
  • Recommended Action (specific next steps)

Sort by Priority Score descending.
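
How the 1-10 priority score is computed is left open here; one possible heuristic, shown purely as an illustration (the weights and scale are assumptions, not something SimpleCov reports), combines risk, fix complexity, and the size of the coverage gap:

```ruby
# Illustrative only: the weights below are made up for the example.
RISK_WEIGHT       = { 'High' => 3, 'Medium' => 2, 'Low' => 1 }.freeze
COMPLEXITY_WEIGHT = { 'High' => 1, 'Medium' => 2, 'Low' => 3 }.freeze # easier fixes rank higher

# coverage_percent: the file's current coverage, 0..100
def priority_score(risk:, complexity:, coverage_percent:)
  gap_factor = (100 - coverage_percent) / 100.0   # 0.0 (fully covered) .. 1.0 (untested)
  raw = RISK_WEIGHT.fetch(risk) * COMPLEXITY_WEIGHT.fetch(complexity) * gap_factor
  (1 + raw).round.clamp(1, 10)                    # map onto the 1-10 scale used in the table
end

priority_score(risk: 'High', complexity: 'Low', coverage_percent: 10) # => 9
```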

6. Rails-Specific Testing Analysis

  • Model Testing: Coverage of validations, associations, scopes, callbacks, custom methods (an example model spec is sketched after this list)
  • Controller Testing: Coverage of happy paths vs error handling, authentication/authorization
  • Integration Testing: Request specs, feature specs, system tests
  • API Testing: Coverage of JSON/XML responses, API versioning
  • Database Operations: Testing of migrations, seeds, complex queries
  • Background Jobs: Coverage of job execution, retry logic, error handling
  • Security: Testing of authentication, authorization, input sanitization, CSRF protection
  • Missing Test Types: Identify gaps in unit/integration/system/feature tests
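
To make the Model Testing item above concrete, a spec that closes validation, association, and scope gaps might look like this (a sketch assuming RSpec, shoulda-matchers, and FactoryBot; the User model and recently_active scope are hypothetical):

```ruby
# spec/models/user_spec.rb -- hypothetical example; adapt names to the real models
require 'rails_helper'

RSpec.describe User, type: :model do
  describe 'validations' do
    it { is_expected.to validate_presence_of(:email) }
    it { is_expected.to validate_uniqueness_of(:email).case_insensitive }
  end

  describe 'associations' do
    it { is_expected.to have_many(:orders).dependent(:destroy) }
  end

  describe '.recently_active' do
    it 'returns only users seen in the last 30 days' do
      active   = create(:user, last_seen_at: 1.day.ago)
      inactive = create(:user, last_seen_at: 2.months.ago)

      expect(described_class.recently_active).to include(active)
      expect(described_class.recently_active).not_to include(inactive)
    end
  end
end
```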

7. Common Rails Testing Anti-Patterns Detected

  • Over-reliance on controller tests vs request specs
  • Missing edge case coverage in models
  • Untested error handling and failure scenarios
  • Missing tests for background jobs and mailers
  • Lack of integration tests for critical user flows
  • Untested authorization/permission logic

8. Actionable Testing Roadmap

Prioritized list of specific issues to resolve, organized by effort level:

  • Quick Wins (< 2 hours): Simple tests with high impact
  • Sprint 1 (2-8 hours): Medium complexity, high priority
  • Sprint 2 (1-2 days): Complex but critical coverage gaps
  • Long-term (> 2 days): Large refactoring or architectural test improvements

Include specific file paths, method names, and recommended testing approaches for each item.

9. Metrics Dashboard

Key numbers (several of these can be tallied straight from SimpleCov's output; see the sketch after this list):

  • Overall coverage percentage
  • Number of files at 0% coverage
  • Number of files at 100% coverage
  • Number of files above 90% coverage
  • Number of files below 50% coverage
  • Coverage by component type (Models %, Controllers %, etc.)
  • Trend direction (if historical data available)
  • Lines of code vs lines of test code ratio
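
For the file-count metrics, the raw data in coverage/.resultset.json can be tallied directly. A rough sketch (the per-file "lines" layout matches recent SimpleCov versions; older versions store the hit-count array without the wrapping hash, which the fallback below covers):

```ruby
# tally_coverage.rb -- counts files per coverage bucket from SimpleCov's resultset
require 'json'

resultset = JSON.parse(File.read('coverage/.resultset.json'))
suite     = resultset.values.first   # e.g. the "RSpec" entry; merge manually if several suites exist
files     = suite.fetch('coverage')

percentages = files.map do |path, data|
  lines    = data.is_a?(Hash) ? data['lines'] : data  # older SimpleCov stores the array directly
  relevant = lines.compact                            # nil marks lines that are not relevant
  covered  = relevant.count { |hits| hits > 0 }
  [path, relevant.empty? ? 100.0 : covered * 100.0 / relevant.size]
end.to_h

puts "Files at 0%:     #{percentages.count { |_, pct| pct.zero? }}"
puts "Files at 100%:   #{percentages.count { |_, pct| pct >= 100 }}"
puts "Files above 90%: #{percentages.count { |_, pct| pct > 90 }}"
puts "Files below 50%: #{percentages.count { |_, pct| pct < 50 }}"
```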

10. Risk Assessment

  • Critical Paths: Identify core user journeys and their test coverage
  • High-Risk Untested Code: Payment processing, authentication, data deletion, etc.
  • Security Implications: Areas where lack of tests could lead to vulnerabilities
  • Technical Debt: Old code with no tests that should be prioritized for coverage

Include specific file paths, line numbers, and code examples where relevant. Focus on actionable insights rather than just reporting numbers. Provide concrete recommendations for improving the test suite architecture and coverage strategy.