
# Asset Processor Tool vX.Y

## Overview

This tool processes 3D asset source files (texture sets, models, etc., provided as ZIP archives or folders) into a standardized library format. It uses configurable presets to interpret different asset sources and automates tasks like file classification, image resizing, channel merging, and metadata generation.

The tool offers both a Graphical User Interface (GUI) for interactive use and a Command-Line Interface (CLI) for batch processing and scripting.

(Note: This tool is currently a work in progress, rewriting features from an original proof of concept. The original script can be found in Deprecated-POC/ for reference.)

## Features

### Core Processing & Classification

  • Preset-Driven: Uses JSON presets (Presets/) to define rules for different asset suppliers (e.g., Poliigon.json).
  • Multi-Asset Input Handling: Correctly identifies and processes multiple distinct assets contained within a single input .zip or folder, creating separate outputs for each.
  • File Classification: Automatically identifies map types (Color, Normal, Roughness, etc.), models, explicitly marked extra files, and unrecognised files based on preset rules.
    • Variant Handling: Map types listed in RESPECT_VARIANT_MAP_TYPES (in config.py, e.g., "COL") always receive a numeric suffix (-1, -2, etc.). Numbering priority uses preset keyword order first, then alphabetical filename sorting as a tie-breaker. Other map types never receive a suffix. (See Architecture for details).
    • 16-bit Prioritization: Correctly identifies and prioritizes 16-bit variants defined in preset bit_depth_variants (e.g., *_NRM16.tif), ignoring the corresponding 8-bit version (marked as Ignored in GUI).
  • Map Processing:
    • Resizes texture maps to configured resolutions (e.g., 4K, 2K, 1K), avoiding upscaling.
    • Handles Glossiness map inversion to Roughness.
    • Applies bit-depth rules (respect source or force_8bit).
    • Saves maps in appropriate formats (JPG, PNG, EXR) based on complex rules involving map type (FORCE_LOSSLESS_MAP_TYPES), resolution (RESOLUTION_THRESHOLD_FOR_JPG), bit depth, and source format. (See Architecture for details).
    • Calculates basic image statistics (Min/Max/Mean) for a reference resolution.
    • Calculates and stores the relative aspect ratio change string in metadata (e.g., EVEN, X150, Y125).
  • Channel Merging: Combines channels from different maps into packed textures (e.g., NRMRGH) based on preset rules (MAP_MERGE_RULES in config.py).
  • Metadata Generation: Creates a metadata.json file for each asset containing details about maps, category, archetype, aspect ratio change, processing settings, etc.
  • Output Organization: Creates a clean, structured output directory (`<output_base>/<supplier>/<asset_name>/`).
  • Optimized Classification: Pre-compiles regular expressions from presets for faster file identification.
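
As a rough illustration of the channel-merging idea, the following NumPy sketch packs a normal map and a roughness map into a single 4-channel image. The function name and the RGB+A channel layout are assumptions for illustration only; the actual NRMRGH layout and rule format are defined by MAP_MERGE_RULES in config.py:

```python
import numpy as np

def merge_nrm_rgh(normal_rgb: np.ndarray, roughness: np.ndarray) -> np.ndarray:
    """Pack a 3-channel normal map and a single-channel roughness map into
    one 4-channel image (RGB = normal, A = roughness). Illustrative only."""
    if normal_rgb.shape[:2] != roughness.shape[:2]:
        raise ValueError("maps must share the same resolution before merging")
    # dstack broadcasts the (H, W) roughness plane to (H, W, 1) and appends it.
    return np.dstack([normal_rgb, roughness])

# Tiny synthetic example: a 2x2 normal map and a 2x2 roughness map.
normal = np.full((2, 2, 3), 128, dtype=np.uint8)
rough = np.full((2, 2), 200, dtype=np.uint8)
packed = merge_nrm_rgh(normal, rough)
```

In the real pipeline both inputs would already be resized to the same target resolution before this step.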

### Interface & Automation

  • Dual Interface: Provides both a user-friendly GUI and a powerful CLI.
  • Parallel Processing: Utilizes multiple CPU cores for faster processing (configurable via --workers in CLI or GUI control).
  • Skip/Overwrite: Can skip processing if the output already exists or force reprocessing (--overwrite flag / checkbox).
  • Directory Monitor: Includes monitor.py script for automated processing of assets dropped into a watched folder.
  • Responsive GUI: Uses background threads (QThread, ProcessPoolExecutor, ThreadPoolExecutor) to keep the UI responsive during intensive operations.
  • GUI Features:
    • Drag-and-drop input (ZIPs/folders).
    • Integrated preset editor panel.
    • Configurable output directory field with browse button.
    • Enhanced live preview table showing predicted file status.
    • Toggleable preview mode (detailed file list vs simple asset list).
    • Toggleable log console panel.
    • Progress bar, cancellation button, clear queue button.

### Integration

  • Blender Integration: Optionally runs Blender scripts (create_nodegroups.py, create_materials.py) after asset processing to automate node group and material creation in specified .blend files. Available via both CLI and GUI.
    • GUI Controls: Checkbox to enable/disable Blender integration and input fields with browse buttons for target .blend files.
  • Docker Support: Includes a Dockerfile for containerized execution.
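
For orientation, this is roughly how a headless Blender invocation is assembled. `--background` and `--python` are standard Blender CLI flags; the variable names standing in for config.py values are hypothetical:

```python
from pathlib import Path

# Hypothetical stand-ins for values resolved from config.py or CLI options.
BLENDER_EXECUTABLE_PATH = "blender"          # or an absolute path
nodegroup_blend = Path("NodeGroups.blend")   # target .blend file
script = Path("blenderscripts/create_nodegroups.py")

# Blender's standard headless invocation: open the .blend in background
# mode and run a Python script against it (e.g. via subprocess.run(cmd)).
cmd = [
    BLENDER_EXECUTABLE_PATH,
    "--background", str(nodegroup_blend),
    "--python", str(script),
]
```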

## Directory Structure

```
Asset_processor_tool/
│
├── main.py                   # CLI Entry Point & processing orchestrator
├── monitor.py                # Directory monitoring script for automated processing
├── asset_processor.py        # Core class handling single asset processing pipeline
├── configuration.py          # Class for loading and accessing configuration
├── config.py                 # Core settings definition (output paths, resolutions, merge rules etc.)
│
├── blenderscripts/           # Scripts for integration with Blender
│   ├── create_nodegroups.py  # Script to create node groups from processed assets
│   └── create_materials.py   # Script to create materials linking to node groups
│
├── gui/                      # Contains files related to the Graphical User Interface
│   ├── main_window.py        # Main GUI application window and layout
│   ├── processing_handler.py # Handles background processing logic for the GUI
│   ├── prediction_handler.py # Handles background file prediction/preview for the GUI
│   └── ...                   # Other GUI components
│
├── Presets/                  # Preset definition files
│   ├── _template.json        # Template for creating new presets
│   ├── Poliigon.json         # Example preset for Poliigon assets
│   └── ...                   # Other presets
│
├── Testfiles/                # Directory containing example input assets for testing
│
├── Tickets/                  # Directory for issue and feature tracking (Markdown files)
│   ├── _template.md          # Template for creating new tickets
│   └── Ticket-README.md      # Explanation of the ticketing system
│
├── requirements.txt          # Python package dependencies for standard execution
├── requirements-docker.txt   # Dependencies specifically for the Docker environment
├── Dockerfile                # Instructions for building the Docker container image
└── README.md                 # This documentation file (you are here!)
```

## Architecture

(This section details the internal design for those interested in the technical implementation or contributing.)

### Core Components

  • config.py: Defines core, global settings (output paths, resolutions, format rules, Blender paths, etc.).
  • Presets/*.json: Supplier-specific JSON files defining rules (filename patterns, map keywords, etc.).
  • configuration.py (Configuration class): Loads config.py settings and merges them with a selected preset JSON. Pre-compiles regex patterns from the preset for faster classification.
  • asset_processor.py (AssetProcessor class): Contains the core logic for processing a single asset, orchestrating the pipeline steps.
  • main.py: Entry point for the CLI. Handles argument parsing, logging, parallel processing pool (ProcessPoolExecutor), calls AssetProcessor via a wrapper, and optionally triggers Blender scripts.
  • gui/: Modules for the GUI (built with PySide6).
  • monitor.py: Implements directory monitoring using watchdog.
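
The classification speed-up from pre-compiled patterns can be sketched like this. This is a minimal sketch: the real Configuration class and the preset keyword syntax may differ (e.g. globs vs. raw regex), and the map_type_mapping excerpt is hypothetical:

```python
import re
from fnmatch import translate

# Hypothetical excerpt of a preset's map_type_mapping.
map_type_mapping = [
    {"target_type": "COL", "keywords": ["*_col*", "*_color*"]},
    {"target_type": "NRM", "keywords": ["*_nrm*", "*_normal*"]},
]

# Pre-compile one alternation per map type, so classifying a file is a
# cheap regex match instead of repeated per-keyword glob matching.
compiled = {
    entry["target_type"]: re.compile(
        "|".join(translate(k) for k in entry["keywords"]), re.IGNORECASE
    )
    for entry in map_type_mapping
}

def classify(filename: str) -> str:
    for map_type, pattern in compiled.items():
        if pattern.match(filename):
            return map_type
    return "Unrecognised"
```

Because dicts preserve insertion order, iteration respects the preset's keyword order, which matters for variant-numbering priority.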

### Parallel Processing (CLI & GUI)

Uses Python's concurrent.futures.ProcessPoolExecutor managed by main.py (CLI) or gui/processing_handler.py (GUI). Each asset is processed in a separate worker process via the main.process_single_asset_wrapper function, which instantiates Configuration and AssetProcessor for isolation. Results are communicated back to the main process.
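
A skeletal version of that wrapper pattern; the real process_single_asset_wrapper performs the full pipeline, while here it is a stub so the pool mechanics are visible:

```python
from concurrent.futures import ProcessPoolExecutor

def process_single_asset_wrapper(asset_path: str):
    """Stub standing in for main.process_single_asset_wrapper. The real
    function instantiates Configuration and AssetProcessor inside the
    worker process for isolation, then returns a serialisable result."""
    return asset_path, "success"

if __name__ == "__main__":
    assets = ["WoodFine001.zip", "MetalPlate002.zip"]
    # One asset per worker process; results are pickled back to the parent.
    with ProcessPoolExecutor(max_workers=2) as pool:
        results = dict(pool.map(process_single_asset_wrapper, assets))
```

The `__main__` guard is required so worker processes can safely re-import the module on platforms that use the spawn start method.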

### Asset Processing Pipeline (AssetProcessor class)

(Note: The `<details>` tag below creates a collapsible section on platforms like GitHub. It may not render correctly everywhere.)

<details> <summary>Click to expand detailed Asset Processing Pipeline Steps</summary>

The AssetProcessor class executes these steps for each asset:

  1. _setup_workspace(): Creates a temporary directory.
  2. _extract_input(): Extracts/copies input to the workspace.
  3. _inventory_and_classify_files(): Scans workspace and classifies files using pre-compiled regex from the loaded Configuration.
    • Identifies Extra/ files, model files, texture maps.
    • Prioritizes 16-bit variants (e.g., *_NRM16.tif) based on bit_depth_variants patterns, tracking ignored 8-bit counterparts.
    • Assigns suffixes (-1, -2) to map variants based on RESPECT_VARIANT_MAP_TYPES (in config.py) and preset keyword order (map_type_mapping).
    • Classifies remaining files as 'Unrecognised'.
  4. _determine_base_metadata(): Determines asset name, category (Texture, Asset, Decal), and archetype (e.g., Wood, Metal) using preset rules (source_naming, asset_category_rules, archetype_rules).
  5. Skip Check: If overwrite is false, checks for existing output directory and metadata.json. Stops if found.
  6. _process_maps(): Processes classified texture maps.
    • Loads image (handles Gloss->Roughness inversion).
    • Resizes to target resolutions (no upscaling).
    • Determines output bit depth (MAP_BIT_DEPTH_RULES).
    • Determines output format (.jpg, .png, .exr) based on:
      • FORCE_LOSSLESS_MAP_TYPES list (overrides all other logic).
      • RESOLUTION_THRESHOLD_FOR_JPG (forces JPG for large 8-bit maps).
      • Source format, target bit depth, configured defaults (OUTPUT_FORMAT_16BIT_PRIMARY, OUTPUT_FORMAT_8BIT).
    • Saves processed maps with fallback logic (e.g., EXR -> PNG if primary fails).
    • Calculates stats (Min/Max/Mean) for CALCULATE_STATS_RESOLUTION.
    • Determines aspect ratio change string (e.g., "EVEN", "X150", "Y075").
  7. _merge_maps(): Combines channels based on MAP_MERGE_RULES (in config.py). Output format determined similarly to _process_maps.
  8. _generate_metadata_file(): Writes all collected info (maps, resolutions, stats, aspect ratio change, etc.) to metadata.json.
  9. _organize_output_files(): Moves processed maps, merged maps, models, metadata, Extra/, Unrecognised/, Ignored/ files to the final structured output directory.
  10. _cleanup_workspace(): Removes the temporary workspace.

</details>

### GUI Architecture (gui/)

  • Framework: PySide6 (Qt for Python).
  • Main Window (main_window.py): Contains preset editor, processing panel (drag-and-drop, output selection, preview table, controls), Blender integration controls, and menus (View menu toggles Log Console / Detailed Preview).
  • Threading: Uses QThread for background tasks:
    • ProcessingHandler (processing_handler.py): Runs main processing pipeline (ProcessPoolExecutor) and Blender scripts in the background.
    • PredictionHandler (prediction_handler.py): Generates file previews (ThreadPoolExecutor) by calling AssetProcessor.get_detailed_file_predictions() in the background.
  • Communication: Uses Qt signals and slots between background threads and the main UI thread. A custom QtLogHandler redirects logs to the UI console.
  • Preset Editor: Allows interactive creation, modification, and saving of preset .json files.

### Monitor Architecture (monitor.py)

  • Watching: Uses watchdog (PollingObserver) to monitor INPUT_DIR.
  • Handling: Custom ZipHandler detects on_created events for .zip files.
  • Parsing: Expects [preset]_filename.zip format, uses PRESET_FILENAME_REGEX to extract preset.
  • Validation: Checks if preset matches a file in Presets/.
  • Trigger: Calls main.run_processing if valid.
  • Management: Moves source .zip to PROCESSED_DIR (success/skip) or ERROR_DIR (failure/invalid preset).
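
Assuming the [preset] placeholder is simply the token before the first underscore, the filename parsing might look like this (the real PRESET_FILENAME_REGEX may differ):

```python
import re

# Hypothetical equivalent of PRESET_FILENAME_REGEX: preset token, then
# an underscore, then the rest of the archive name.
PRESET_FILENAME_REGEX = re.compile(r"^(?P<preset>[^_]+)_(?P<name>.+)\.zip$", re.IGNORECASE)

def parse_monitored_zip(filename: str):
    """Return (preset, base name) for a valid drop, or None if the
    filename does not follow the preset_filename.zip convention."""
    m = PRESET_FILENAME_REGEX.match(filename)
    return (m.group("preset"), m.group("name")) if m else None
```

A `None` result corresponds to the "invalid preset" path, where the monitor moves the source archive to ERROR_DIR.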

### Error Handling

  • Uses custom exceptions (ConfigurationError, AssetProcessingError).
  • Uses standard Python logging throughout. Log levels are configurable.
  • Worker processes report exceptions back to the main process.
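
A minimal sketch of that error-reporting pattern. The exception class names come from the list above; their exact hierarchy and the wrapper's return shape are assumptions:

```python
class ConfigurationError(Exception):
    """Raised when config.py or a preset cannot be loaded or is invalid."""

class AssetProcessingError(Exception):
    """Raised when processing a single asset fails."""

def safe_process(asset_path: str):
    """Worker-side pattern: convert exceptions into a plain (status, message)
    tuple so the result can cross the process boundary and be logged."""
    try:
        # Stand-in for the real pipeline; here it always fails for the demo.
        raise AssetProcessingError(f"could not process {asset_path}")
    except (ConfigurationError, AssetProcessingError) as exc:
        return "error", str(exc)
```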

## Requirements

  • Python 3.8+
  • Required Python Packages (see requirements.txt):
    • opencv-python (image processing)
    • numpy (numerical operations)
    • PySide6 (for the GUI only)
    • watchdog (for the directory monitor)
  • Optional Python Packages:
    • OpenEXR (improved .exr handling, recommended if processing EXR sources)
  • Blender: A working installation is required for optional Blender integration. Configure the path in config.py or ensure it's in the system PATH.

Install dependencies using pip:

```shell
# Install core dependencies
pip install -r requirements.txt

# If using the GUI, ensure PySide6 is installed
# (it might be included in requirements.txt, or install separately)
# pip install PySide6

# If using the monitor script, ensure watchdog is installed
# (it might be included in requirements.txt, or install separately)
# pip install watchdog
```

## Configuration

Tool behavior is controlled by:

### 1. Core Settings (config.py)

Defines global settings:

  • OUTPUT_BASE_DIR: Default root output directory.
  • DEFAULT_ASSET_CATEGORY: Fallback category ("Texture", "Asset", "Decal").
  • IMAGE_RESOLUTIONS: Dictionary mapping names to pixel dimensions (e.g., "4K": [4096, 4096]).
  • RESPECT_VARIANT_MAP_TYPES: List of map types (e.g., ["COL"]) that always get numeric suffixes.
  • TARGET_FILENAME_PATTERN: Format string for output filenames.
  • MAP_MERGE_RULES: List defining channel merges (e.g., creating NRMRGH).
  • ARCHETYPE_RULES: Rules for determining asset archetype (e.g., Wood, Metal).
  • RESOLUTION_THRESHOLD_FOR_JPG: Size threshold above which 8-bit maps are forced to JPG.
  • FORCE_LOSSLESS_MAP_TYPES: List of map types (e.g., ["NRM", "DISP"]) always saved losslessly (PNG/EXR).
  • BLENDER_EXECUTABLE_PATH: Path to Blender (for integration).
  • DEFAULT_NODEGROUP_BLEND_PATH: Default .blend for node groups (GUI).
  • DEFAULT_MATERIALS_BLEND_PATH: Default .blend for materials (GUI).
  • ... and other parameters (JPEG quality, PNG compression, output formats, etc.).
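
A hypothetical excerpt showing the shape of these settings; the values and the MAP_MERGE_RULES structure are illustrative only, so consult the real config.py:

```python
# Hypothetical excerpt of config.py; names match those documented above,
# but values and structures are illustrative.
OUTPUT_BASE_DIR = "G:/Assets/Processed"
DEFAULT_ASSET_CATEGORY = "Texture"

IMAGE_RESOLUTIONS = {"4K": [4096, 4096], "2K": [2048, 2048], "1K": [1024, 1024]}

RESPECT_VARIANT_MAP_TYPES = ["COL"]         # these always get -1, -2 suffixes
FORCE_LOSSLESS_MAP_TYPES = ["NRM", "DISP"]  # always saved losslessly (PNG/EXR)
RESOLUTION_THRESHOLD_FOR_JPG = 2048         # 8-bit maps above this go to JPG

MAP_MERGE_RULES = [
    # Shape of a merge rule is assumed for illustration.
    {"output_type": "NRMRGH", "inputs": {"RGB": "NRM", "A": "RGH"}},
]
```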

### 2. Preset Files (Presets/*.json)

Define supplier-specific rules. Use Presets/_template.json as a base.

  • supplier_name: Source name (e.g., "Poliigon").
  • map_type_mapping: List mapping source patterns to standard types (e.g., {"target_type": "COL", "keywords": ["_col*", "_color"]}). Keyword order determines variant numbering priority.
  • bit_depth_variants: Maps types to patterns for high bit-depth files (e.g., "NRM": "*_NRM16*.tif"). These take priority.
  • map_bit_depth_rules: How to handle source bit depth (e.g., {"default": "respect", "overrides": {"NRM": "force_8bit"}}). (Note: verify the exact rule keys against the current implementation.)
  • model_patterns: Regex for model files (e.g., ".*\\.fbx", ".*\\.obj").
  • move_to_extra_patterns: Regex for files to move directly to Extra/.
  • source_naming_convention: Rules for extracting base name/archetype from source filenames.
  • asset_category_rules: Keywords/patterns for specific categories (e.g., "Decal").
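
A hypothetical preset illustrating these fields; the value shapes are guesses based on the examples above, so use Presets/_template.json as the authoritative starting point:

```json
{
  "supplier_name": "ExampleSupplier",
  "map_type_mapping": [
    { "target_type": "COL", "keywords": ["_col*", "_color"] },
    { "target_type": "NRM", "keywords": ["_nrm*", "_normal"] },
    { "target_type": "RGH", "keywords": ["_rgh*", "_roughness"] }
  ],
  "bit_depth_variants": { "NRM": "*_NRM16*.tif" },
  "map_bit_depth_rules": { "default": "respect", "overrides": { "NRM": "force_8bit" } },
  "model_patterns": [".*\\.fbx", ".*\\.obj"],
  "move_to_extra_patterns": [".*_preview\\..*"],
  "asset_category_rules": { "Decal": ["decal"] }
}
```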

## Usage

### 1. Graphical User Interface (GUI)

  • Run (from project root):
    `python -m gui.main_window`
    
  • Interface:
    • Menu Bar: "View" menu toggles Log Console and Detailed File Preview.
    • Preset Editor Panel (Left):
      • Optional Log Console: Displays application logs (toggle via View menu).
      • Preset List: Create, delete, load, edit, save presets. Select here to edit below.
      • Preset Editor Tabs: Edit selected preset details.
    • Processing Panel (Right):
      • Preset Selector: Choose preset for processing the current queue.
      • Output Directory: Set output path (defaults to config.py, use "Browse...").
      • Drag and Drop Area: Add asset .zip files or folders.
      • Preview Table: Shows queued assets. Mode depends on "View" menu:
        • Detailed Preview (Default): Lists all files, predicted status (Mapped, Model, Extra, Unrecognised, Ignored, Error), output name, etc., based on selected processing preset. Color-coded rows.
        • Simple View (Preview Disabled): Lists only top-level input asset paths.
      • Progress Bar: Overall processing progress.
      • Blender Post-Processing: Checkbox to enable Blender scripts. If enabled, shows fields and browse buttons for target .blend files (defaults from config.py).
      • Options & Controls (Bottom):
        • Overwrite Existing: Checkbox to force reprocessing.
        • Workers: Spinbox for concurrent processes.
        • Clear Queue: Button to clear the queue and preview.
        • Start Processing: Button to start processing the queue.
        • Cancel: Button to attempt stopping processing.
    • Status Bar: Displays current status, errors, completion messages.

### 2. Command-Line Interface (CLI)

  • Run:
    `python main.py [OPTIONS] INPUT_PATH [INPUT_PATH ...]`
    
  • Arguments:
    • INPUT_PATH: One or more paths to input .zip files or folders.
  • Options:
    • -p PRESET, --preset PRESET: (Required) Name of the preset (e.g., Poliigon).
    • -o OUTPUT_DIR, --output-dir OUTPUT_DIR: Override OUTPUT_BASE_DIR from config.py.
    • -w WORKERS, --workers WORKERS: Number of parallel processes (default: auto).
    • --overwrite: Force reprocessing and overwrite existing output.
    • -v, --verbose: Enable detailed DEBUG level logging.
    • --nodegroup-blend NODEGROUP_BLEND: Path to .blend for node groups. Triggers script if provided. Overrides config.py.
    • --materials-blend MATERIALS_BLEND: Path to .blend for materials. Triggers script if provided. Overrides config.py.
  • Example:
    `python main.py "C:/Downloads/WoodFine001.zip" -p Poliigon -o "G:/Assets/Processed" --workers 4 --overwrite --nodegroup-blend "G:/Blender/Libraries/NodeGroups.blend" --materials-blend "G:/Blender/Libraries/Materials.blend"`
    

### 3. Directory Monitor (Automated Processing)

  • Run:
    `python monitor.py`
    
  • Functionality: Monitors input directory for new .zip files named [preset]_filename.zip. Automatically processes them using the extracted preset. Moves source zip to processed/error directory.
  • Note: The monitor does not currently trigger the optional Blender scripts; that integration is available only via the CLI and GUI.
  • Configuration (Environment Variables):
    • INPUT_DIR: Directory to monitor (default: /data/input).
    • OUTPUT_DIR: Base output directory (default: /data/output).
    • PROCESSED_DIR: Dir for successful source ZIPs (default: /data/processed).
    • ERROR_DIR: Dir for failed source ZIPs (default: /data/error).
    • LOG_LEVEL: Logging verbosity (INFO, DEBUG) (default: INFO).
    • POLL_INTERVAL: Check frequency (seconds) (default: 5).
    • PROCESS_DELAY: Delay before processing detected file (seconds) (default: 2).
    • NUM_WORKERS: Number of parallel workers (default: auto).
  • Output: Logs to console. Creates assets in OUTPUT_DIR. Moves source .zip.
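
The monitor's settings could be resolved from the environment like this (an illustrative stdlib sketch using the documented names and defaults; the real monitor.py may differ):

```python
import os

# Resolve each setting from the environment, falling back to the
# documented default when the variable is unset.
INPUT_DIR = os.environ.get("INPUT_DIR", "/data/input")
OUTPUT_DIR = os.environ.get("OUTPUT_DIR", "/data/output")
PROCESSED_DIR = os.environ.get("PROCESSED_DIR", "/data/processed")
ERROR_DIR = os.environ.get("ERROR_DIR", "/data/error")
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")
POLL_INTERVAL = float(os.environ.get("POLL_INTERVAL", "5"))
PROCESS_DELAY = float(os.environ.get("PROCESS_DELAY", "2"))
```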

### 4. Blender Node Group Creation Script (blenderscripts/create_nodegroups.py)

  • Purpose: Creates/updates PBR node groups in a .blend file from processed assets. Designed to be run within Blender.
  • Execution: Typically triggered by Asset Processor (CLI/GUI). Can be run manually in Blender's Text Editor.
  • Prerequisites (for manual run):
    • Processed asset library at a known path.
    • Target .blend file containing template node groups Template_PBRSET and Template_PBRTYPE.
  • Configuration (for manual run - update inside script):
    • PROCESSED_ASSET_LIBRARY_ROOT: Path to the base output directory containing supplier folders (e.g., Poliigon/). (Overridden by CLI/GUI).
  • Functionality: Reads metadata.json, creates/updates node groups, loads textures, sets up nodes, applies metadata settings (aspect ratio, stats, resolution), sets previews. Saves the .blend file.

### 5. Blender Material Creation Script (blenderscripts/create_materials.py)

  • Purpose: Creates/updates materials in a .blend file that link to the PBRSET node groups created by create_nodegroups.py. Designed to be run within Blender.
  • Execution: Typically triggered by Asset Processor (CLI/GUI). Can be run manually in Blender's Text Editor.
  • Prerequisites (for manual run):
    • Processed asset library at a known path.
    • The .blend file containing the PBRSET node groups.
    • A template material named Template_PBRMaterial in the current .blend file containing a Group node labeled PLACEHOLDER_NODE_LABEL.
  • Configuration (for manual run - update inside script):
    • PROCESSED_ASSET_LIBRARY_ROOT: Path to the base output directory. (Overridden by CLI/GUI).
    • NODEGROUP_BLEND_FILE_PATH: Path to the .blend file containing PBRSET node groups. (Overridden by CLI/GUI).
    • Constants like TEMPLATE_MATERIAL_NAME, PLACEHOLDER_NODE_LABEL can be adjusted.
  • Functionality: Reads metadata, copies template material, links corresponding PBRSET node group from the specified library file, marks as asset, copies tags, sets previews, sets viewport properties. Saves the .blend file.

## Processing Pipeline (Simplified)

  1. Extraction: Input copied/extracted to temporary workspace.
  2. Classification: Files classified (map, model, extra, ignored, unrecognised) using preset rules.
  3. Metadata Determination: Asset name, category, archetype identified.
  4. Skip Check: Stop if output exists and --overwrite is false.
  5. Map Processing: Resize, convert format/bit depth, save maps. Invert Gloss if needed. Calculate stats.
  6. Merging: Merge channels based on rules, save merged maps.
  7. Metadata Generation: Create metadata.json.
  8. Output Organization: Move processed files to final output directory.
  9. Cleanup: Remove temporary workspace.
  10. (Optional) Blender Scripts: If triggered (CLI/GUI), run create_nodegroups.py and create_materials.py via Blender executable.

## Output Structure

Processed assets are saved to: `<output_base_directory>/<supplier_name>/<asset_name>/`

Each asset directory contains:

  • Processed texture maps (e.g., AssetName_Color_4K.png, AssetName_NRM_2K.exr).
  • Merged texture maps (e.g., AssetName_NRMRGH_4K.png).
  • Model files (if present).
  • metadata.json: Details about the asset and processing.
  • Extra/ (subdirectory): Source files not classified as maps/models (e.g., previews, docs, unrecognised files).
  • Ignored/ (subdirectory): Source files explicitly ignored during processing (e.g., 8-bit NRM when a 16-bit variant exists).
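
A hypothetical metadata.json illustrating the documented contents; the exact schema is not specified here, so every field name below is a guess:

```json
{
  "asset_name": "WoodFine001",
  "supplier": "Poliigon",
  "category": "Texture",
  "archetype": "Wood",
  "aspect_ratio_change": "EVEN",
  "maps": {
    "COL": {
      "resolutions": ["4K", "2K", "1K"],
      "format": "jpg",
      "stats": { "min": 0.04, "max": 0.91, "mean": 0.43 }
    },
    "NRM": { "resolutions": ["4K", "2K", "1K"], "format": "exr" }
  }
}
```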

## Docker

A Dockerfile and requirements-docker.txt are provided. Build and run using standard Docker commands for a containerized environment (primarily for CLI or monitor usage).

```shell
# Example build command (run from project root)
docker build -t asset-processor-tool .

# Example run command (adjust volumes as needed)
# docker run -v /path/to/your/inputs:/data/input -v /path/to/your/outputs:/data/output asset-processor-tool python main.py /data/input/YourAsset.zip -p YourPreset
```