Overview

Before publishing a workflow to production, you should test it thoroughly to ensure all screens, transitions, automations, and data fields work as expected. ERPLite provides a test mode that simulates real task execution.

  • Safe Environment: Test without affecting production data
  • Real Simulation: Experience the workflow as users will
  • Debug Mode: See automation logs and error details
  • Quick Iteration: Make changes and re-test immediately

Starting Test Mode

  1. Open Your Workflow: Navigate to the workflow you want to test in the workflow builder.
  2. Click Test: Click the Test button in the top toolbar.
  3. Configure Test Data: Fill in any create-time data fields if required.
  4. Start Test: Click Start Test to begin the simulation.
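
If you prefer to start the same simulation from a script, the test API documented under For AI Agents below accepts the create-time data from step 3. A minimal TypeScript sketch; the workflow ID and token are placeholders:

// Start a test session with create-time data (step 3 above).
// Endpoint and body shape follow the "For AI Agents" section below.
const token = "YOUR_API_TOKEN"; // placeholder
const res = await fetch("/api/v1/workflows/wf_123/test", {
  method: "POST",
  headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
  body: JSON.stringify({ initialData: { customer_name: "Test Customer" } }),
});
const session = await res.json(); // { testSessionId, currentNode, dataFields, availableTransitions }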

Test Mode Features

Mobile Preview

The test mode shows a mobile device preview where you can:
  • Navigate through screens
  • Fill in form fields
  • Tap transition buttons
  • See how the UI looks on mobile

Data Field Inspector

View and modify data field values during testing:
Feature | Description
Current Values | See all data field values at any point
Edit Values | Manually change values to test edge cases
Reset | Reset all values to start fresh

Automation Logs

See detailed logs of automation execution:
  • Which automations triggered
  • Condition evaluation results
  • API call requests and responses
  • Error messages and stack traces
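
These logs can also be pulled programmatically via the test API (see Get Test Logs under For AI Agents). A minimal TypeScript sketch, assuming a bearer token and an active test session; the response shape is not specified, so it is typed loosely:

// Fetch automation logs for an active test session.
// Endpoint from the "For AI Agents" section; response typed as unknown.
async function getTestLogs(testSessionId: string, token: string): Promise<unknown> {
  const res = await fetch(`/api/v1/workflows/test/${testSessionId}/logs`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) throw new Error(`Failed to fetch logs: ${res.status}`);
  return res.json();
}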

What to Test

Screen Navigation

  • All transitions navigate to the correct screens
  • Back navigation works as expected
  • Decision nodes route correctly based on conditions
  • Terminal steps properly complete the task

Data Collection

  • Required fields show validation errors when empty (see the sketch after this list)
  • Data types are enforced (numbers, dates, etc.)
  • Data persists when navigating between screens
  • Images and files upload correctly
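
Required-field validation can also be exercised outside the UI by attempting a transition with empty data through the test API. A sketch, assuming the server rejects the transition with a non-2xx status when a required field is missing (the exact error shape may differ):

// Attempt a transition without filling required fields and expect a rejection.
// Assumption: validation failures return a non-2xx status.
async function expectValidationError(testSessionId: string, token: string): Promise<void> {
  const res = await fetch(`/api/v1/workflows/test/${testSessionId}/transition`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({ transitionName: "NEXT", data: {} }), // required fields left empty
  });
  if (res.ok) throw new Error("Expected a validation error, but the transition succeeded");
}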

Automations

  • Screen on-load automations execute
  • Transition automations trigger at the right time
  • API calls succeed with valid responses
  • Error handling works when APIs fail
  • Validations block transitions when conditions aren’t met

UI Components

  • All components render correctly
  • Conditional visibility works (show/hide)
  • Dropdowns and lists have correct options
  • Read-only fields display data properly

Testing Edge Cases

Test with Different Data

Try your workflow with various data scenarios:
Scenario | Purpose
Empty values | Test required field validation
Boundary values | Test min/max limits
Special characters | Test text handling
Large files | Test upload limits
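
These scenarios lend themselves to a data-driven loop over the test API. A sketch, assuming a startTestSession helper that wraps POST /api/v1/workflows/:workflowId/test from the For AI Agents section; field names follow the API example there, and the workflow ID is a placeholder:

// Data-driven edge-case testing: start one test session per scenario.
// `startTestSession` is a hypothetical helper wrapping the test API below.
declare function startTestSession(
  workflowId: string,
  initialData: Record<string, unknown>,
): Promise<{ testSessionId: string }>;

const scenarios = [
  { name: "empty values", initialData: { customer_name: "" } },
  { name: "boundary values", initialData: { order_amount: 0 } },
  { name: "special characters", initialData: { customer_name: "O'Brien & Söhne <test>" } },
];

for (const scenario of scenarios) {
  const session = await startTestSession("wf_123", scenario.initialData);
  console.log(`${scenario.name}: session ${session.testSessionId} started`);
}

Large-file upload limits are easier to exercise interactively in the mobile preview than through initial data.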

Test Decision Paths

For workflows with decision nodes, test every possible path:
  1. Identify all decision nodes
  2. Determine all possible outcomes
  3. Create test cases for each outcome
  4. Verify correct routing in each case
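
One way to keep path coverage honest is a table of inputs and expected routes, checked against the currentNode the API reports after each transition. A sketch using hypothetical helpers over the endpoints under For AI Agents; the node names are illustrative, not real node IDs:

// Verify that each decision outcome routes to the expected node.
// Both helpers are hypothetical wrappers around the test API documented below.
declare function startTestSession(
  workflowId: string,
  initialData: Record<string, unknown>,
): Promise<{ testSessionId: string; currentNode: string }>;
declare function executeTransition(
  testSessionId: string,
  transitionName: string,
  data: Record<string, unknown>,
): Promise<{ currentNode: string }>;

// Example decision: a high order_amount routes to approval (illustrative names).
const paths = [
  { input: { order_amount: 1000 }, expectedNode: "node_approval" },
  { input: { order_amount: 100 }, expectedNode: "node_confirmation" },
];

for (const path of paths) {
  const session = await startTestSession("wf_123", path.input);
  const result = await executeTransition(session.testSessionId, "NEXT", {});
  if (result.currentNode !== path.expectedNode) {
    throw new Error(`Expected ${path.expectedNode}, got ${result.currentNode}`);
  }
}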

Test Error Scenarios

Intentionally trigger errors to verify handling:
  • Disconnect network to test offline behavior
  • Use invalid API endpoints
  • Enter invalid data formats
  • Test with expired tokens
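
The token scenario in particular is easy to script: send a request with a deliberately invalid token and confirm the API rejects it rather than proceeding. A sketch; the 401/403 expectation is standard HTTP behavior, not an ERPLite-specific guarantee:

// Simulate an expired/invalid token and assert the API rejects the call.
async function expectAuthFailure(workflowId: string): Promise<void> {
  const res = await fetch(`/api/v1/workflows/${workflowId}/test`, {
    method: "POST",
    headers: {
      Authorization: "Bearer expired_or_invalid_token",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ initialData: {} }),
  });
  // Assumption: an invalid token yields 401/403 rather than a 2xx.
  if (res.status !== 401 && res.status !== 403) {
    throw new Error(`Expected an auth error, got ${res.status}`);
  }
}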

Common Issues and Solutions

Automation doesn't trigger
  • Verify the trigger event is correct
  • Check condition expressions for errors
  • Look for JavaScript syntax errors in scripts
  • Ensure the data fields referenced exist

API call fails
  • Verify the endpoint URL is correct
  • Check authentication headers
  • Validate the request body format
  • Look for CORS issues in the browser console

Data isn't saved to fields
  • Ensure components are bound to data fields
  • Check that data field types match component types
  • Verify the “Save Data to” field is configured

Decision node routes incorrectly
  • Review the condition logic
  • Check data field values before the decision
  • Test condition expressions in isolation
  • Verify fallback/default paths

Test Reports

After testing, you can view a summary:
Metric | Description
Screens Visited | Number of screens navigated
Automations Run | Count of automations executed
Errors | Any errors encountered
Time Taken | Duration of test session

From Test to Production

Once testing is complete:
  1. Review Results: Ensure all test cases passed.
  2. Fix Issues: Address any bugs or issues found.
  3. Re-test: Test again after making changes.
  4. Publish: When satisfied, publish the workflow (see Publishing).

For AI Agents

Test Mode API

POST /api/v1/workflows/:workflowId/test
Authorization: Bearer {token}
Content-Type: application/json

{
  "initialData": {
    "customer_name": "Test Customer",
    "order_amount": 100
  }
}
Response:
{
  "testSessionId": "test_abc123",
  "currentNode": "node_start",
  "dataFields": {...},
  "availableTransitions": ["NEXT"]
}

Execute Test Transition

POST /api/v1/workflows/test/:testSessionId/transition
Authorization: Bearer {token}
Content-Type: application/json

{
  "transitionName": "NEXT",
  "data": {
    "field_1": "value_1"
  }
}

Get Test Logs

GET /api/v1/workflows/test/:testSessionId/logs
Authorization: Bearer {token}
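
Putting the three endpoints together, a minimal end-to-end session in TypeScript. Field names follow the examples above; the host, token, workflow ID, and error handling are assumptions:

// Walk a workflow through test mode: start a session, fire a transition, read the logs.
const BASE_URL = "https://erplite.example.com"; // assumption: your ERPLite host
const TOKEN = "YOUR_API_TOKEN"; // placeholder

async function api<T>(path: string, method: string, body?: unknown): Promise<T> {
  const res = await fetch(`${BASE_URL}${path}`, {
    method,
    headers: { Authorization: `Bearer ${TOKEN}`, "Content-Type": "application/json" },
    body: body === undefined ? undefined : JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`${method} ${path} failed: ${res.status}`);
  return res.json() as Promise<T>;
}

interface TestSession {
  testSessionId: string;
  currentNode: string;
  availableTransitions: string[];
}

// 1. Start a test session (POST /api/v1/workflows/:workflowId/test)
const session = await api<TestSession>("/api/v1/workflows/wf_123/test", "POST", {
  initialData: { customer_name: "Test Customer", order_amount: 100 },
});

// 2. Execute the first available transition
await api(`/api/v1/workflows/test/${session.testSessionId}/transition`, "POST", {
  transitionName: session.availableTransitions[0],
  data: { field_1: "value_1" },
});

// 3. Read the automation logs
const logs = await api(`/api/v1/workflows/test/${session.testSessionId}/logs`, "GET");
console.log(logs);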

UI Components

Component | Location | Purpose
TestMode | /src/views/workflows/test/ | Test mode container
MobilePreview | /src/views/workflows/test/MobilePreview/ | Device simulator
DataInspector | /src/views/workflows/test/DataInspector/ | Data field viewer
AutomationLogs | /src/views/workflows/test/AutomationLogs/ | Execution logs

Event Tracking

// Test mode events
WorkflowEvents.TEST_STARTED
WorkflowEvents.TEST_TRANSITION_EXECUTED
WorkflowEvents.TEST_AUTOMATION_EXECUTED
WorkflowEvents.TEST_ERROR
WorkflowEvents.TEST_COMPLETED
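
How these events are consumed depends on the codebase's event bus. A hypothetical sketch, assuming an emitter-style on() subscription; the import path and emitter API are assumptions, only the WorkflowEvents constants come from the list above:

// Hypothetical subscription: the event names are documented above, but the
// `workflowEventBus.on(...)` API and the module path are assumptions.
import { WorkflowEvents, workflowEventBus } from "@/views/workflows/events";

workflowEventBus.on(WorkflowEvents.TEST_ERROR, (payload: unknown) => {
  console.error("Test error during simulation:", payload);
});
workflowEventBus.on(WorkflowEvents.TEST_COMPLETED, () => {
  console.log("Test session completed");
});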