
Understanding Testing Modes


Overview

ratl.ai provides two distinct modes to help teams optimize their software testing workflows:

  1. Assisted Mode – Ideal for modular testing where users have full control.

  2. Autonomous Mode – Designed for end-to-end automation with minimal manual setup.


Assisted Mode

Use Case: Best for QA teams and developers who want to run API, web application, load, and performance tests as separate tasks.

Setup Flow

  1. Mode Selection: User selects "Assisted Mode" from the initial screen.

  2. Workspace Setup (if the user is the first in the organization)

    • Name the workspace

    • Define organization URL

    • (Optional) Add a workspace description

    • Enable auto-join for users with a matching email domain (a minimal domain-matching sketch follows this setup flow)

  3. Workspace Discovery

    • Shows existing workspaces within the org

    • Option to join or create a new one

  4. Team Invitation

    • Invite teammates by entering email addresses

    • Option to skip this step

  5. User Role Setup

    • Enter user name

    • Choose from roles: QA, Developer, Product Manager, Tech Lead, CXO, Designer, DevOps, TechOps, SiteOps, Other

  6. Test Type Selection: Final step before reaching the dashboard. Choose from:

    • API Functional Testing

    • API Integration (E2E) Testing

    • Load & Performance Testing

    • Web Application Testing

Once a test type is selected, the user is directed to the dashboard to begin testing.
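
For illustration, here is a minimal sketch of how the domain-based auto-join option (step 2 above) could be evaluated. The function name and fields are assumptions made for this example, not ratl.ai's actual implementation.

```python
# Hypothetical helper: checks whether a user's email domain matches the
# workspace's auto-join domain. Illustrative only.

def can_auto_join(user_email: str, allowed_domain: str) -> bool:
    """Return True if the email's domain matches the workspace auto-join domain."""
    domain = user_email.rsplit("@", 1)[-1].lower()
    return domain == allowed_domain.lower()

print(can_auto_join("dev@acme.com", "acme.com"))   # True
print(can_auto_join("dev@other.io", "acme.com"))   # False
```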


Autonomous Mode

Use Case: Built for teams that prefer a hands-off, intelligent testing flow using context-driven automation.

Setup Flow

  1. Mode Selection: User selects "Autonomous Mode" from the initial screen.

  2. Workspace Setup: Same as Assisted Mode (workspace name, org URL, invite team, join existing workspace, and set user role).

  3. SDLC Context Collection: The user selects relevant tools and practices from their software development lifecycle (SDLC). Each selected item contributes to a projected context score:

    • Requirements / User Stories: tools like JIRA, ClickUp, Notion

    • Design/UX Artifacts: tools like Figma, Miro, Whimsical

    • API Documentation: tools like Swagger, Postman, Stoplight, Insomnia

    • Code Repository: tools like GitHub, GitLab, Bitbucket

    • Code Quality / Static Analysis: tools like SonarQube, ESLint, Codacy

    • AI Code Assistants: tools like GitHub Copilot, Cursor, TabNine, CodeWhisperer

    • CI/CD: tools like Jenkins, GitHub Actions, CircleCI

    • Analytics: tools like GA, Mixpanel, Heap, Amplitude

    • Test Environment: tools like Docker, Kubernetes

    • Monitoring & APM: tools like New Relic, Datadog, Prometheus, Sentry

  4. Platform Integration: The user further integrates their tool stack by selecting platforms used across project management, testing, collaboration, and environments.

  5. Context Score Assessment: User receives a Projected Context Score (0–100) based on SDLC coverage and tool integrations. Score factors include:

    • Test Planning

    • Code Quality

    • API Documentation

    • CI/CD Context & Readiness

    • Test Environment Context

    Each section is scored to help identify gaps and optimize automation readiness; a simplified scoring sketch follows this list.
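
As a rough illustration, the sketch below shows how a projected context score could be assembled from weighted SDLC categories. The category names and weights are assumptions made for this example and do not reflect ratl.ai's actual scoring model.

```python
# Illustrative only: category weights are assumed, not ratl.ai's real model.
CATEGORY_WEIGHTS = {
    "test_planning": 25,
    "code_quality": 20,
    "api_documentation": 20,
    "cicd_readiness": 20,
    "test_environment": 15,
}  # weights sum to 100, matching the 0-100 score range


def projected_context_score(covered_categories: set[str]) -> int:
    """Sum the weights of the SDLC categories the user has covered."""
    return sum(
        weight
        for category, weight in CATEGORY_WEIGHTS.items()
        if category in covered_categories
    )


# Example: API docs, CI/CD, and a containerized test environment are covered,
# but test planning and static analysis are not.
print(projected_context_score({"api_documentation", "cicd_readiness", "test_environment"}))
# -> 55
```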

End of Setup

After the context score is shown, users are ready to go autonomous and will be directed to the dashboard.


Summary of Differences

| Feature | Assisted Mode | Autonomous Mode |
| --- | --- | --- |
| Control | Manual test selection | Fully automated testing flow |
| Integration Required | Minimal | Extensive (SDLC-wide) |
| Best For | Modular, focused testing | Holistic, continuous testing |
| Setup Depth | 4 steps | Up to 6 steps, including integrations and scoring |
| Outcome | Manual dashboard launch | Auto-generated test strategy & dashboard |

For more detail, see "Learn more about Assisted and Autonomous Modes" in the product UI.
