Treeify AI

User Guide

Product Name: Treeify
Version: 0.2-beta
Date: 2025.07.10

In this section, we'll explore how to use our web application effectively. Whether you're a new user or an experienced one, this section will guide you through various aspects of the application's usage.


User Interface

1. Dashboard

image.png

Here's what you’ll see and how to use each area:

📂 Recent Projects

Located at the center of the page, this section displays all your active and previously created projects in a card-style layout.

➕ Create Project

Click the green **“Create Project”** button (top-right of the Recent Projects panel) to start a new test design project.

💳 Current Balance

Located in the top-right panel, this shows your remaining monthly credit balance.

Each AI operation (e.g. test object analysis or scenario generation) consumes a small number of credits.

Early Access users receive 200 free credits per month.

📊 Test Analytics

Track your generation activity over time. You can view usage data by:

  • Today
  • Past 3 Days
  • Past 7 Days

This helps you understand how often and how deeply you're working with Treeify.

📝 Change Log / User Guide / Getting Started

At the bottom of the page are quick-access tiles to help you stay informed and supported:

  • Change Log — View the latest product updates
  • User Guide — Step-by-step guidance on each function
  • Getting Started — Return to the onboarding instructions

These resources are always available to help you make the most of Treeify.

2. Project Management

image.png

One of the primary uses of our web application is creating and organizing test design projects. Here's how you can get started:

  1. Create a Directory

    • Go to the project page
    • Click "Create Project," then select "Create Dictionary."
    • Enter the "Directory Name" and "Parent Directory," and confirm by clicking "OK."

    image.png

  2. Rename a Directory

    • Select an existing directory.
    • Click "Rename" to update its name.

    image.png

    • Enter the revised name of the project directory.

image.png

  3. Delete a Directory

    • Click "Delete" to permanently remove the directory and its contents.

    Note: Deleting a directory will also remove all associated projects and subdirectories.

  4. Create a Project

    • Go to the project page.
    • Click "Create Project."
    • Enter the "Project Name" and any additional details for clarity.
    • Confirm by clicking "OK."

image.png

  5. Edit a Project

Click the "…" menu to access available options:

image.png

  • Rename the Project: Update the name of an existing project.
  • Move the Project: Reorganize the project by moving it to a different directory.

image.png

  • Export the Project: Select desired results to export and choose a preferred file format.

3. Project Setting

3.1 Input Requirements

The first step in generating high-quality test cases with Treeify is providing clear and structured product requirements. Treeify supports two input methods to fit different workflow needs: Rich Text Input and File Upload.

✨ Input Methods

1. Type Text (Rich Text Editor)

Best for: Quick entry, lightweight modules, or when you don’t have a formal document.

  • Supports common formatting: bold, italic, headings, bullet points, etc.
  • Allows you to structure functional and non-functional requirements clearly.
  • Auto-saves your input to avoid accidental data loss.
  • Recommended to follow a clear structure (e.g., Feature → Input → Output → Errors).

Steps:

  1. Select the Type Text option.
  2. Enter or paste your requirements in the editor.
  3. Click Next Step to continue to Project Setting.

image.png

2. Upload File

Best for: Formal requirement documents (PRDs, functional specs), long and complex inputs.

  • Supports .docx, .pdf, and .txt formats.
  • Only one file is allowed per project.
  • Treeify will automatically parse the uploaded document to extract structured information.

Steps:

  1. Select the Upload File option.
  2. Drag and drop your file, or click to upload it from your local device.
  3. Optionally download our formatting template as a reference.
  4. Once uploaded, click Next Step to proceed to Project Setting.

image.png

⚠️ Notes & Limitations

  • Choose one method only per project. You cannot use both text and file input together.
  • Be as specific and structured as possible—quality of input directly affects the accuracy and coverage of generated test cases.
  • Recommended structure (see the example below):
    • Functional Description
    • Input/Output Behavior
    • Business Rules
    • Error Handling
    • Access Control
    • Non-functional Requirements
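
For reference, a lightweight requirement written in this structure might look like the following (the feature and values are purely illustrative):

    Functional Description: Users can reset their password from the login page.
    Input/Output Behavior: The user enters a registered email address; the system sends a reset link that expires after 24 hours.
    Business Rules: At most 3 reset requests per account per hour.
    Error Handling: Unregistered addresses receive a generic "If this address exists, a link has been sent" message.
    Access Control: Only unauthenticated users can open the reset form.
    Non-functional Requirements: The reset email should arrive within 60 seconds.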

3.2 Project Setting

image.png

After you've input your requirements, the next step is configuring your project settings. This step ensures that the AI adapts to the domain context, test strategy, and coverage level best suited for your specific project.

The Project Setting page includes four major components:


  1. Industry

Select the domain or business context in which your system operates. Treeify uses this information to tailor test logic with appropriate terminology, process patterns, and domain assumptions.

Examples:

  • SaaS / Software Platforms
  • Finance
  • Healthcare
  • E-commerce
  • Government

Selecting the correct industry improves test relevance and reduces hallucination in complex workflows.


  2. Testing Stage

Choose the phase of testing your current design work corresponds to. This setting influences the granularity and focus of the generated scenarios.

Options include:

  • System Testing
  • Integration Testing
  • Acceptance Testing
  • UAT (User Acceptance Testing)
  • Regression Testing

This helps the AI focus on scenario types that reflect your current QA priorities.


  3. Test Types

Select which types of tests Treeify should generate. You can choose one or more depending on your QA scope.

Supported Types:

  • Functional
  • Compatibility
  • Performance
  • Security
  • API (Coming Soon)

Treeify will structure test scenarios to include validations, flows, and paths that reflect the needs of these test types.


  4. Generation Strategy

This controls how conservatively or creatively the AI generates results based on the input.

  • Strict Mode
    • AI generates only from the explicitly provided requirement content
    • No inference or assumption
    • Ensures maximum alignment and traceability
  • Complement Mode
    • AI tries to expand or fill in missing pieces based on typical domain behavior
    • Useful when the input is partial or high-level
    • Enables broader scenario discovery

Use Strict mode for regulated projects or detailed specs; use Complement mode for early-stage or exploratory QA.

4. Test Case Design Workflow

Treeify's workflow consists of five main steps; for more detail, see the step-by-step instructions below:

4.1 Requirement Analysis

In the Requirement Analysis phase, Treeify intelligently analyzes your input documents (PRDs, user stories, etc.) and decomposes them into structured testing elements across selected test dimensions.

🧠 Dimension-Based Analysis

Based on your selections in Project Settings, Treeify breaks down the requirements into multiple testing dimensions. Currently, the platform supports:

  • Functional
  • 🔌 API
  • 🔐 Security
  • ⚙️ Performance
  • 🌐 Compatibility

📌 Compliance testing will be available in an upcoming release.

📂 Automatic Field Completion with AI

Each testing item will be parsed into structured fields (e.g., function name, expected behavior, data flow).

If a specific field is missing from the input requirement, Treeify will auto-tag it with:

<Note: NOT found in PRD. Please review and supplement as needed>

This ensures traceability and transparency — nothing is hallucinated, and testers can easily identify gaps.

✏️ Manual Review and Edit

Each node in the requirement mind map is editable. Simply click on any node to refine its content, correct logic, or add missing context. Treeify supports full human-in-the-loop collaboration.

image.png

4.2 Test Object Generation

In the Test Object Generation phase, Treeify transforms analyzed requirements into structured test objects — the fundamental building blocks for test scenario design.

image.png

🔍 Aspect-Oriented Decomposition

For each requirement, Treeify's AI determines which aspects are relevant to focus on, depending on the nature of the requirement (e.g., functional, API, security).

For example, when handling functional requirements, Treeify may automatically segment the requirement into the following test object aspects:

  • 🧠 Business Function — Core logic or process logic to be validated
  • 🔄 Data Processing — How input, output, and transformations are handled
  • 👤 Interactivity — User interactions and behavior paths
  • 🎨 UI & UX — Visual layout, responsiveness, and experience elements
  • 🔐 Security (if relevant) — Input validation, permission enforcement, etc.

Each aspect is represented as a test object node in the mind map, allowing for clear focus and traceable scenario generation.

✏️ Fully Editable Nodes

Click on any test object node to inspect or modify its details — including test dimension, object name, scope, and assumptions. This supports iterative refinement and collaborative QA thinking.

image.png

4.3 Test Scenario Generation

In the Test Scenario Generation phase, Treeify’s fine-tuned scenario agent generates realistic, high-coverage test scenarios for each test object — combining industry best practices with internal QA expertise.

image.png

🎯 Comprehensive Coverage by Design

Each test scenario is derived from the type and context of the test object. For example, a business function object will be tested through multiple key lenses, including:

  • 🔐 Access Control — Who can trigger the function, under what roles/permissions
  • 🔁 Input-Output Logic — Handling of different input types, validation rules, and system responses
  • ⚠️ Exception Handling — Error paths, edge cases, and unexpected states
  • 📈 Business Coverage — Full process workflows, branching conditions, data variation
  • 🔄 Process Integrity — State transitions, data consistency, and reversibility

📝 Structured Test Case Format

Each scenario includes:

  • Scenario Name — Clear and structured format: [Action] - [Trigger Condition] - [Expected Outcome]
  • Description — Plain language summary of what’s being tested
  • Test Steps — Concrete steps a QA tester would follow
  • Expected Output — Clear, observable result of the scenario
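
For example, a generated scenario for a login feature might look like the following (the feature and values are illustrative, not output from a real project):

  • Scenario Name: Submit Login Form - Invalid Password - Error Message Displayed
  • Description: Verify that a registered user entering a wrong password sees an error and is not logged in.
  • Test Steps: Open the login page; enter a registered email with an incorrect password; click "Log In."
  • Expected Output: An "Incorrect email or password" message is shown and the user remains on the login page.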

All scenarios are organized into the mind map view, making the coverage visual, traceable, and easy to edit.

✏️ Note-Based Re-Generation (Coming Soon)

Soon, users will be able to write natural-language notes on each node to provide corrections or preferences. The scenario agent will learn from these notes and regenerate the results accordingly — enabling human-AI collaborative improvement.

5. Export Test Case

Treeify supports two flexible export methods to help you seamlessly integrate generated test cases into your testing workflow:

🔹 Method 1: Export as a Local File

You can export your test cases as a file for local use or import into other test management systems.

✅ Supported Formats:

  • Excel (.xlsx) — Suitable for reviewing and editing test cases in spreadsheets or importing into compatible tools.
  • JSON (.json) — Structured machine-readable format, useful for integrations or advanced processing.
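
If you plan to post-process a JSON export programmatically, a minimal Python sketch is shown below. The file name and field names used here are assumptions for illustration only; inspect your own export for the exact schema (the fields simply mirror the scenario structure described in Test Scenario Generation).

```python
import json

# Load a Treeify JSON export. "treeify_export.json" and the field names below
# are illustrative assumptions, not a documented schema.
with open("treeify_export.json", encoding="utf-8") as f:
    export = json.load(f)

# Print each scenario as a readable outline.
for scenario in export.get("scenarios", []):
    print(scenario["name"])
    print("  Description:", scenario["description"])
    for step in scenario["steps"]:
        print("  Step:", step)
    print("  Expected:", scenario["expected_output"])
```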

📌 Steps to Export:

  1. After completing the Test Scenario Generation step, click the Export button.
  2. In the popup:
    • Select Mind Map (e.g., Test Object Generation or Test Scenario Analysis).
    • Choose Export As File.
    • Select File Format: Excel or JSON.
  3. Click OK to download the file.

image.png

🔹 Method 2: Export Directly to TestCaseLab (via API)

Treeify provides native integration with TestCaseLab, allowing you to push test cases directly into your existing project for further test execution and management.

🧾 Required Information:

To enable API export, you must fill in the following fields (provided by your TestCaseLab account):

  • API Token
  • Company ID
  • Project ID

These credentials allow Treeify to authenticate and push test cases into the correct TestCaseLab project.
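
Conceptually, the export is an authenticated HTTP push: Treeify uses the token and IDs above to send each generated case into the selected TestCaseLab project. The Python sketch below only illustrates that flow; the endpoint URL, payload fields, and authentication header are placeholders (not TestCaseLab's documented API), and in practice Treeify performs this call for you.

```python
import requests

# Placeholders for illustration only; Treeify handles the real call internally.
# Consult TestCaseLab's API documentation for actual endpoints and payloads.
API_TOKEN = "your-testcaselab-api-token"
COMPANY_ID = "your-company-id"
PROJECT_ID = "your-project-id"
BASE_URL = "https://testcaselab.example"  # placeholder, not a real endpoint

test_case = {
    "title": "Submit Login Form - Invalid Password - Error Message Displayed",
    "description": "Wrong password shows an error and does not log the user in.",
}

response = requests.post(
    f"{BASE_URL}/companies/{COMPANY_ID}/projects/{PROJECT_ID}/test_cases",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json=test_case,
    timeout=30,
)
response.raise_for_status()  # raises if the credentials or IDs are rejected
```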

📌 Steps to Export:

  1. Click the Export button after test case generation.

  2. In the popup:

    • Select Mind Map to export (e.g., Test Scenario Analysis).
    • Choose Export via API.
    • Select Third Party App: TestCaseLab.
    • Fill in your API Token, Company ID, and Project ID.
  3. Click OK.

  4. If the information is valid, Treeify will upload the test cases directly to your TestCaseLab project.

    image.png

💡 Notes:

  • Your API Token, Company ID, and Project ID are stored locally and securely after the first successful input — no need to re-enter each time.
  • Only completed mind maps (Requirement Analysis, Test Object Analysis, or Test Scenario Generation) can be exported.
  • Ensure your TestCaseLab account has sufficient permissions to create test cases in the selected project.

User Roles and Permissions

Coming Soon...