Initial commit
Documentation/02_Developer_Guide/01_Architecture.md (new file, 46 lines)
@@ -0,0 +1,46 @@
# Developer Guide: Architecture

This document provides a high-level overview of the Asset Processor Tool's architecture and core components for developers.

## High-Level Architecture

The Asset Processor Tool is designed to process 3D asset source files into a standardized library format. Its high-level architecture consists of:

1. **Core Processing Engine (`AssetProcessor`):** The central component responsible for orchestrating the asset processing pipeline for a single input asset.
2. **Configuration System (`Configuration`):** Handles loading core settings and merging them with supplier-specific rules defined in JSON presets.
3. **Multiple Interfaces:** Provides different ways to interact with the tool:
    * Graphical User Interface (GUI)
    * Command-Line Interface (CLI)
    * Directory Monitor for automated processing

    These interfaces exchange data structures containing information about each file and asset set being processed.
4. **Optional Integration:** Includes scripts and logic for integrating with external software, specifically Blender, to automate material and node group creation.
## Core Components

* `config.py`: Defines core, global settings and constants.
* `Presets/*.json`: Supplier-specific JSON files defining rules for file interpretation and processing.
* `configuration.py` (`Configuration` class): Loads `config.py` settings and merges them with a selected preset, pre-compiling regex patterns for efficiency.
* `asset_processor.py` (`AssetProcessor` class): Contains the core logic for processing a *single* asset through the defined pipeline steps. This component works with data structures containing detailed information about the asset and its files.
* `main.py`: The entry point for the Command-Line Interface (CLI). It handles argument parsing, logging, parallel processing orchestration, and triggering Blender scripts. It orchestrates the processing of multiple assets by interacting with the `AssetProcessor` for individual assets and manages the overall CLI execution flow.
* `gui/`: Directory containing modules for the Graphical User Interface (GUI), built with PySide6. The GUI interacts with the core processing logic indirectly via dedicated handler classes (`ProcessingHandler`, `PredictionHandler`) running in separate threads. Data structures, such as input paths, configuration details, processing progress updates, and file prediction results, are passed between the GUI, these handlers, and the `AssetProcessor` using thread-safe mechanisms like Qt signals and slots.
* `monitor.py`: Implements the directory monitoring feature using `watchdog`.
* `blenderscripts/`: Contains Python scripts designed to be executed *within* Blender for post-processing tasks.
## Processing Pipeline (Simplified)

The core processing engine (`AssetProcessor`) executes a series of steps for each asset:

1. Extraction of input to a temporary workspace.
2. Classification of files (map, model, extra, ignored, unrecognised) using preset rules.
3. Determination of base metadata (asset name, category, archetype).
4. Skip check if output exists and overwrite is not forced.
5. Processing of maps (resize, format/bit depth conversion, inversion, stats calculation).
6. Merging of channels based on rules.
7. Generation of the `metadata.json` file.
8. Organization of processed files into the final output structure.
9. Cleanup of the temporary workspace.
10. (Optional) Execution of Blender scripts for post-processing.
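The step order above can be sketched as a single `process()` method. This is an illustrative skeleton only: the step names mirror the pipeline, the bodies are stubs, and `_output_exists` stands in for the real skip check in `asset_processor.py`.

```python
# Illustrative sketch only: step names mirror the pipeline above, but the
# bodies are stubs (the real implementations live in asset_processor.py).
class PipelineSketch:
    def __init__(self, overwrite=False):
        self.overwrite = overwrite
        self.steps = []

    def _run(self, name):
        self.steps.append(name)

    def _output_exists(self):
        return False  # real check: output dir + metadata.json present

    def process(self):
        self._run("setup_workspace")
        try:
            self._run("extract_input")
            self._run("inventory_and_classify")
            self._run("determine_base_metadata")
            if self._output_exists() and not self.overwrite:
                return "skipped"
            for step in ("process_maps", "merge_maps",
                         "generate_metadata", "organize_output"):
                self._run(step)
            return "processed"
        finally:
            self._run("cleanup_workspace")  # runs even if a step raises

print(PipelineSketch().process())  # prints "processed"
```

Note that cleanup runs in a `finally` block, matching step 9 of the real pipeline.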
This architecture allows for a modular design, separating configuration, core processing logic, and different interfaces. The data structures flowing through this pipeline carry detailed information about individual files and asset sets. Examples include lists of input file paths, configuration objects, dictionaries summarizing processing outcomes, and detailed lists of file predictions. Parallel processing is utilized for efficiency, and background threads keep the GUI responsive.

**Note on Data Passing:** Major changes to the data passing mechanisms between the GUI, Main (CLI orchestration), and `AssetProcessor` modules are currently being planned. These changes are expected to involve new data structures and updated interaction patterns to convey detailed specifications for datasets/asset-sets and processing instructions for individual files. The documentation in this section, particularly regarding data flow, will require significant review and updates once the plan for these changes is finalized.
Documentation/02_Developer_Guide/02_Codebase_Structure.md (new file, 64 lines)
@@ -0,0 +1,64 @@
# Developer Guide: Codebase Structure

This document outlines the key files and directories within the Asset Processor Tool project.

```
Asset_processor_tool/
├── asset_processor.py              # Core class handling single asset processing pipeline
├── config.py                       # Core settings definition (output paths, resolutions, merge rules etc.)
├── configuration.py                # Class for loading and accessing configuration (merges config.py and presets)
├── detailed_documentation_plan.md  # (Existing file, potentially outdated)
├── Dockerfile                      # Instructions for building the Docker container image
├── documentation_plan.md           # Plan for the new documentation structure (this plan)
├── documentation.txt               # Original developer documentation (to be migrated)
├── main.py                         # CLI entry point & processing orchestrator
├── monitor.py                      # Directory monitoring script for automated processing
├── readme.md                       # Original main documentation file (to be migrated)
├── readme.md.bak                   # Backup of readme.md
├── requirements-docker.txt         # Dependencies specifically for the Docker environment
├── requirements.txt                # Python package dependencies for standard execution
├── blenderscripts/                 # Scripts for integration with Blender
│   ├── create_materials.py         # Script to create materials linking to node groups
│   └── create_nodegroups.py        # Script to create node groups from processed assets
├── Deprecated-POC/                 # Directory containing original proof of concept scripts
│   ├── Blender-MaterialsFromNodegroups.py
│   ├── Blender-NodegroupsFromPBRSETS.py
│   └── Standalonebatcher-Main.py
├── Documentation/                  # New directory for organized documentation (this structure)
│   ├── 00_Overview.md
│   ├── 01_User_Guide/
│   └── 02_Developer_Guide/
├── gui/                            # Contains files related to the Graphical User Interface
│   ├── main_window.py              # Main GUI application window and layout
│   ├── processing_handler.py       # Handles background processing logic for the GUI
│   ├── prediction_handler.py       # Handles background file prediction/preview for the GUI
│   ├── preview_table_model.py      # Model and proxy for the GUI's preview table
│   └── ...                         # Other GUI components
├── Presets/                        # Preset definition files
│   ├── _template.json              # Template for creating new presets
│   ├── Poliigon.json               # Example preset for Poliigon assets
│   └── ...                         # Other presets
├── Project Notes/                  # Directory for issue and feature tracking (Markdown files)
│   └── ...                         # Various planning and note files
└── Testfiles/                      # Directory containing example input assets for testing
    └── ...                         # Example asset ZIPs
```
**Key Files and Directories:**

* `asset_processor.py`: Contains the `AssetProcessor` class, the core logic for processing a single asset through the pipeline. Includes methods for classification, map processing, merging, metadata generation, and output organization. Also provides methods for predicting the output structure, used by the GUI.
* `configuration.py`: Defines the `Configuration` class. Responsible for loading core settings from `config.py` and merging them with a specified preset JSON file (`Presets/*.json`). Pre-compiles regex patterns from presets for efficiency.
* `config.py`: Stores global default settings, constants, and core rules (e.g., standard map types, default resolutions, merge rules, output format rules, Blender paths).
* `main.py`: Entry point for the Command-Line Interface (CLI). Handles argument parsing, logging setup, and parallel processing orchestration (using `concurrent.futures.ProcessPoolExecutor`); calls `AssetProcessor` via a wrapper function and optionally triggers Blender scripts.
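The orchestration pattern described for `main.py` can be sketched as follows. `process_single_asset_wrapper` here is a stand-in that only echoes its input (the real wrapper builds a `Configuration` and runs an `AssetProcessor`), and the `executor_cls` parameter is added purely for illustration and testing.

```python
from concurrent.futures import ProcessPoolExecutor, as_completed

def process_single_asset_wrapper(asset_path):
    # Stand-in: the real wrapper in main.py creates a per-process
    # Configuration and calls AssetProcessor.process().
    return asset_path, "processed"

def run_all(asset_paths, executor_cls=ProcessPoolExecutor, max_workers=4):
    """Submit one task per asset and collect a {path: status} summary."""
    results = {}
    with executor_cls(max_workers=max_workers) as pool:
        futures = [pool.submit(process_single_asset_wrapper, p)
                   for p in asset_paths]
        for fut in as_completed(futures):
            path, status = fut.result()
            results[path] = status
    return results
```

A top-level (module-scope) wrapper function is required here because `ProcessPoolExecutor` must pickle the callable to send it to worker processes.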
* `monitor.py`: Implements the automated directory monitoring feature using the `watchdog` library. Contains the `ZipHandler` class to detect new ZIP files and trigger processing via `main.run_processing`.
* `gui/`: Directory containing all code related to the Graphical User Interface (GUI), built with PySide6.
    * `main_window.py`: Defines the `MainWindow` class: the main application window structure, UI layout, event handling, and menu setup. Manages GUI-specific logging (`QtLogHandler`).
    * `processing_handler.py`: Defines the `ProcessingHandler` class (runs on a `QThread`). Manages the execution of the main asset processing pipeline and Blender script execution in the background.
    * `prediction_handler.py`: Defines the `PredictionHandler` class (runs on a `QThread`). Manages background file analysis/preview generation.
    * `preview_table_model.py`: Defines `PreviewTableModel` and `PreviewSortFilterProxyModel` for managing and displaying data in the GUI's preview table.
* `blenderscripts/`: Contains Python scripts (`create_nodegroups.py`, `create_materials.py`) designed to be executed *within* Blender for post-processing.
* `Presets/`: Contains supplier-specific configuration files in JSON format.
* `Testfiles/`: Contains example input assets for testing purposes.
* `Project Notes/`: Directory for issue and feature tracking using Markdown files.

**Note on Data Passing:** As mentioned in the Architecture documentation, major changes to the data passing mechanisms between the GUI, Main (CLI orchestration), and `asset_processor` modules are currently being planned. The descriptions of module interactions and data flow within this document reflect the current state and will require review and updates once the plan for these changes is finalized.
Documentation/02_Developer_Guide/03_Key_Components.md (new file, 70 lines)
@@ -0,0 +1,70 @@
# Developer Guide: Key Components

This document describes the major classes and modules that form the core of the Asset Processor Tool.

## `AssetProcessor` (`asset_processor.py`)

The `AssetProcessor` class is the central engine of the tool. It is responsible for processing a *single* input asset (either a ZIP archive or a folder) through the entire pipeline. Its key responsibilities include:

* Setting up and cleaning up a temporary workspace for processing.
* Extracting or copying input files to the workspace.
* Inventorying and classifying files based on configured rules (maps, models, extra, ignored, unrecognised).
* Determining asset metadata such as name, category, and archetype.
* Processing texture maps (resizing, format/bit depth conversion, handling Gloss->Roughness inversion, calculating statistics).
* Merging channels from different maps according to merge rules.
* Generating the `metadata.json` file containing details about the processed asset.
* Organizing the final output files into the structured library directory.
* Providing methods (`get_detailed_file_predictions`) used by the GUI for previewing file classification.
## `Configuration` (`configuration.py`)

The `Configuration` class manages the tool's settings. It is responsible for:

* Loading the core default settings defined in `config.py`.
* Loading the supplier-specific rules from a selected preset JSON file (`Presets/*.json`).
* Merging the core settings and preset rules into a single, unified configuration object.
* Validating the loaded configuration to ensure required settings are present.
* Pre-compiling regular expression patterns defined in the preset for efficient file classification by the `AssetProcessor`.

An instance of the `Configuration` class is typically created once per asset processing task (within a worker process) to ensure isolated and correct settings for each asset.
## `MainWindow` (`gui/main_window.py`)

The `MainWindow` class is the main application window for the Graphical User Interface (GUI). It handles the overall UI layout and user interaction:

* Sets up the main window structure, including panels for the preset editor and processing controls.
* Manages the layout of UI elements like the drag-and-drop area, preview table, buttons, and status bar.
* Connects user actions (button clicks, drag/drop events) to corresponding handler methods (slots).
* Interacts with background processing and prediction handlers (`ProcessingHandler`, `PredictionHandler`) via Qt signals and slots to update the UI safely from background threads.
* Manages the GUI-specific logging handler (`QtLogHandler`) to display logs in the UI console.
## `ProcessingHandler` (`gui/processing_handler.py`)

The `ProcessingHandler` class is designed to run in a separate `QThread` within the GUI. Its purpose is to manage the execution of the main asset processing pipeline and optional Blender scripts in the background, preventing the GUI from freezing. It:

* Manages a `concurrent.futures.ProcessPoolExecutor` to run individual asset processing tasks (`AssetProcessor.process()`) in separate worker processes.
* Submits processing tasks to the pool and monitors their completion.
* Communicates progress, status updates, and results back to the `MainWindow` using Qt signals.
* Handles the execution of Blender scripts via subprocess calls after asset processing is complete.
* Provides logic for cancelling ongoing processing tasks (though cancellation of already running worker processes is not immediate).
## `PredictionHandler` (`gui/prediction_handler.py`)

The `PredictionHandler` class also runs in a separate `QThread` in the GUI. It is responsible for generating file classification previews in the background without blocking the UI. It:

* Calls methods on the `AssetProcessor` (specifically `get_detailed_file_predictions`) to analyze input files and predict their classification and output names based on the selected processing preset.
* Uses a `ThreadPoolExecutor` for potentially concurrent prediction tasks.
* Sends the prediction results back to the `MainWindow` via Qt signals to update the preview table.
## `ZipHandler` (`monitor.py`)

The `ZipHandler` is a custom event handler used by the `monitor.py` script, built upon the `watchdog` library. It is responsible for:

* Detecting file system events, specifically the creation of new `.zip` files, in the monitored input directory.
* Validating the filename format of detected ZIPs to extract the intended preset name.
* Triggering the main asset processing logic (`main.run_processing`) for valid new ZIP files.
* Managing the movement of processed source ZIP files to 'processed' or 'error' directories.
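The validation step can be sketched as below. Note that the `"<preset>__<asset>.zip"` naming scheme used here is purely an assumption for illustration; the actual filename format is defined by `monitor.py`.

```python
from pathlib import Path

def parse_monitored_zip(path):
    """Hypothetical sketch: return (preset_name, asset_name) for a ZIP
    named "<preset>__<asset>.zip" (an assumed scheme), else None."""
    p = Path(path)
    if p.suffix.lower() != ".zip" or "__" not in p.stem:
        return None
    preset, _, asset = p.stem.partition("__")
    return (preset, asset) if preset and asset else None
```

A valid ZIP yields the preset to load; anything else would be left alone or routed to the 'error' directory.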
These key components work together to provide the tool's functionality, separating concerns and utilizing concurrency for performance and responsiveness.

**Note on Data Passing:** As mentioned in the Architecture documentation, major changes to the data passing mechanisms between the GUI, Main (CLI orchestration), and `AssetProcessor` modules are currently being planned. The descriptions of module interactions and data flow within this document reflect the current state and will require review and updates once the plan for these changes is finalized.
@@ -0,0 +1,44 @@
# Developer Guide: Configuration System and Presets

This document provides technical details about the configuration system and the structure of preset files for developers working on the Asset Processor Tool.

## Configuration Flow

The tool utilizes a two-tiered configuration system:

1. **Core Settings (`config.py`):** This Python module defines global default settings, constants, and core rules that apply generally across different asset sources. Examples include default output paths, standard image resolutions, map merge rules, output format rules, Blender executable paths, and default map types.
2. **Preset Files (`Presets/*.json`):** These JSON files define supplier-specific rules and overrides. They contain patterns (often regular expressions) to interpret filenames, classify map types, handle variants, define naming conventions, and specify other source-specific behaviors.
## `Configuration` Class (`configuration.py`)

The `Configuration` class is responsible for loading, merging, and preparing the configuration settings for use by the `AssetProcessor`.

* **Initialization:** An instance is created with a specific `preset_name`.
* **Loading:**
    * It loads the core settings from `config.py` using `importlib.util`.
    * It loads the specified preset JSON file from the `Presets/` directory.
* **Merging:** The loaded core settings and preset rules are merged into a single configuration object accessible via instance attributes. Preset values generally override core settings where applicable.
* **Validation (`_validate_configs`):** Performs basic structural validation on the loaded settings, checking for the presence of required keys and basic data types (e.g., ensuring `map_type_mapping` is a list of dictionaries).
* **Regex Compilation (`_compile_regex_patterns`):** A crucial step for performance. It iterates through the regex patterns defined in the preset (for extra files, models, bit depth variants, and map keywords) and compiles them using `re.compile` (mostly case-insensitive). The compiled regex objects are stored as instance attributes (e.g., `self.compiled_map_keyword_regex`) for fast matching during file classification. A helper (`_fnmatch_to_regex`) handles basic wildcard (`*`, `?`) conversion in patterns.

An instance of `Configuration` is created within each worker process (`main.process_single_asset_wrapper`) to ensure that each concurrently processed asset uses the correct, isolated configuration based on the specified preset.
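A minimal sketch of this load-merge-compile flow, with simplified stand-in keys (the real attribute names live in `config.py` and `configuration.py`):

```python
import re

# Stand-in for the defaults defined in config.py.
CORE_DEFAULTS = {"output_base_dir": "Library", "model_patterns": []}

def build_configuration(preset: dict) -> dict:
    """Merge core defaults with a preset dict; preset values win on
    conflicts, and preset regexes are compiled once up front."""
    cfg = {**CORE_DEFAULTS, **preset}
    cfg["compiled_model_patterns"] = [
        re.compile(p, re.IGNORECASE) for p in cfg.get("model_patterns", [])
    ]
    return cfg
```

Compiling once per configuration, rather than per file, is what makes classification of large asset sets cheap.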
## Preset File Structure (`Presets/*.json`)

Preset files are the primary way to adapt the tool to new asset sources. Developers should use `Presets/_template.json` as a starting point. Key fields include:

* `supplier_name`: The name of the asset source (e.g., `"Poliigon"`). Used for output directory naming.
* `map_type_mapping`: A list of dictionaries, each mapping source filename patterns/keywords to a standard internal map type (defined in `config.py`).
    * `target_type`: The standard internal map type (e.g., `"COL"`, `"NRM"`).
    * `keywords`: A list of filename patterns (regex or fnmatch-style wildcards) used to identify this map type. The order of keywords within this list, and the order of dictionaries in the `map_type_mapping` list, determines the priority for assigning variant suffixes (`-1`, `-2`, etc.) when multiple files match the same `target_type`.
* `bit_depth_variants`: A dictionary mapping standard map types (e.g., `"NRM"`) to a pattern identifying its high bit-depth variant (e.g., `"*_NRM16*.tif"`). Files matching these patterns are prioritized over their standard counterparts.
* `map_bit_depth_rules`: Defines how to handle the bit depth of source maps. Can specify a default behavior (`"respect"` or `"force_8bit"`) and overrides for specific map types.
* `model_patterns`: A list of regex patterns to identify model files (e.g., `".*\\.fbx"`, `".*\\.obj"`).
* `move_to_extra_patterns`: A list of regex patterns for files that should be moved directly to the `Extra/` output subdirectory without further processing.
* `source_naming_convention`: Rules for extracting the base asset name and potentially the archetype from source filenames or directory structures (e.g., using separators and indices).
* `asset_category_rules`: Keywords or patterns used to determine the asset category (e.g., identifying `"Decal"` based on keywords).
* `archetype_rules`: Keywords or patterns used to determine the asset archetype (e.g., identifying `"Wood"` or `"Metal"`).
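A hypothetical preset fragment illustrating some of the fields above. The key names follow this guide, but the values are invented for illustration and should not be copied verbatim; `Presets/_template.json` remains the authoritative template.

```json
{
  "supplier_name": "ExampleSupplier",
  "map_type_mapping": [
    { "target_type": "COL", "keywords": ["*_Color*", "*_Albedo*"] },
    { "target_type": "NRM", "keywords": ["*_Normal*"] }
  ],
  "bit_depth_variants": { "NRM": "*_NRM16*.tif" },
  "map_bit_depth_rules": { "default": "respect" },
  "model_patterns": [".*\\.fbx", ".*\\.obj"],
  "move_to_extra_patterns": [".*preview.*\\.jpg"]
}
```

Because `map_type_mapping` is ordered, placing the `"COL"` rule first gives its matches priority when variant suffixes are assigned.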
Careful definition of these patterns and rules, especially the regex in `map_type_mapping`, `bit_depth_variants`, `model_patterns`, and `move_to_extra_patterns`, is essential for correct asset processing.

**Note on Data Passing:** As mentioned in the Architecture documentation, major changes to the data passing mechanisms between the GUI, Main (CLI orchestration), and `AssetProcessor` modules are currently being planned. The descriptions of how configuration data is handled and passed within this document reflect the current state and will require review and updates once the plan for these changes is finalized.
Documentation/02_Developer_Guide/05_Processing_Pipeline.md (new file, 82 lines)
@@ -0,0 +1,82 @@
# Developer Guide: Processing Pipeline

This document details the step-by-step technical process executed by the `AssetProcessor` class (`asset_processor.py`) when processing a single asset.

The `AssetProcessor.process()` method orchestrates the following pipeline:

1. **Workspace Setup (`_setup_workspace`)**:
    * Creates a temporary directory using `tempfile.mkdtemp()` to isolate the processing of the current asset.

2. **Input Extraction (`_extract_input`)**:
    * If the input is a supported archive type (.zip, .rar, .7z), it is extracted into the temporary workspace using the appropriate library (`zipfile`, `rarfile`, or `py7zr`).
    * If the input is a directory, its contents are copied into the temporary workspace.
    * Includes basic error handling for invalid or password-protected archives.
3. **File Inventory and Classification (`_inventory_and_classify_files`)**:
|
||||
* Scans the contents of the temporary workspace.
|
||||
* Uses the pre-compiled regex patterns from the loaded `Configuration` object to classify each file.
|
||||
* Classification follows a multi-pass approach for priority:
|
||||
* Explicitly marked `Extra/` files (using `move_to_extra_patterns` regex).
|
||||
* Model files (using `model_patterns` regex).
|
||||
* Potential Texture Maps (matching `map_type_mapping` keyword patterns).
|
||||
* Standalone 16-bit variants check (using `bit_depth_variants` patterns).
|
||||
* Prioritization of 16-bit variants over their 8-bit counterparts (marking the 8-bit version as `Ignored`).
|
||||
* Final classification of remaining potential maps.
|
||||
* Remaining files are classified as `Unrecognised` (and typically moved to `Extra/` later).
|
||||
* Stores the classification results (including source path, determined map type, potential variant suffix, etc.) in `self.classified_files`.
|
||||
* Sorts potential map variants based on preset rule order, keyword order within the rule, and finally alphabetical path to determine suffix assignment priority (`-1`, `-2`, etc.).
|
||||
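The multi-pass priority can be sketched as follows. The pattern lists stand in for the pre-compiled regexes on the `Configuration` object, and the 16-bit variant passes are omitted for brevity:

```python
import re

def classify(filename, extra_patterns, model_patterns, map_patterns):
    """Return 'Extra', 'Model', 'Map', or 'Unrecognised' by pass order."""
    for pat in extra_patterns:                 # pass 1: explicit Extra/ files
        if re.fullmatch(pat, filename, re.IGNORECASE):
            return "Extra"
    for pat in model_patterns:                 # pass 2: model files
        if re.fullmatch(pat, filename, re.IGNORECASE):
            return "Model"
    for pat in map_patterns:                   # pass 3: texture map keywords
        if re.fullmatch(pat, filename, re.IGNORECASE):
            return "Map"
    return "Unrecognised"                      # fallthrough: moved to Extra/ later
```

The ordering matters: a file matching both an `Extra/` pattern and a map keyword is claimed by the earlier pass.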
4. **Base Metadata Determination (`_determine_base_metadata`, `_determine_single_asset_metadata`)**:
    * Determines the base asset name using `source_naming_convention` rules from the `Configuration` (separators, indices), with fallbacks to common prefixes or the input name. Handles multiple distinct assets within a single input source.
    * Determines the asset category (`Texture`, `Asset`, `Decal`) based on the presence of model files or `decal_keywords` in the `Configuration`.
    * Determines the asset archetype (e.g., `Wood`, `Metal`) by matching keywords from `archetype_rules` (in `Configuration`) against file stems or the determined base name.
    * Stores this preliminary metadata.

5. **Skip Check**:
    * If the `overwrite` flag (passed during initialization) is `False`, the tool checks whether the final output directory for the determined asset name already exists and contains a `metadata.json` file.
    * If both exist, processing for this specific asset is skipped, marked as "skipped", and the pipeline moves to the next asset (if processing multiple assets from one source) or finishes.
6. **Map Processing (`_process_maps`)**:
|
||||
* Iterates through the files classified as texture maps for the current asset.
|
||||
* Loads the image using `cv2.imread` (handling grayscale and unchanged flags). Converts BGR to RGB internally for consistency (except for saving non-EXR formats).
|
||||
* Handles Glossiness-to-Roughness inversion if necessary (loads gloss, inverts `1.0 - img/norm`, prioritizes gloss source if both exist).
|
||||
* Resizes the image to target resolutions defined in `IMAGE_RESOULTIONS` (from `Configuration`) using `cv2.resize` (`INTER_LANCZOS4` for downscaling). Upscaling is generally avoided by checks.
|
||||
* Determines the output bit depth based on `MAP_BIT_DEPTH_RULES` (`respect` vs `force_8bit`).
|
||||
* Determines the output file format (`.jpg`, `.png`, `.exr`) based on a hierarchy of rules:
|
||||
* `FORCE_LOSSLESS_MAP_TYPES` list (overrides other logic).
|
||||
* `RESOLUTION_THRESHOLD_FOR_JPG` (forces JPG for large 8-bit maps).
|
||||
* Source format, target bit depth, and configured defaults (`OUTPUT_FORMAT_16BIT_PRIMARY`, `OUTPUT_FORMAT_8BIT`).
|
||||
* Converts the NumPy array data type appropriately before saving (e.g., float to uint8/uint16 with scaling).
|
||||
* Saves the processed map using `cv2.imwrite` (converting RGB back to BGR if saving to non-EXR formats). Includes fallback logic (e.g., attempting PNG if saving 16-bit EXR fails).
|
||||
* Calculates image statistics (Min/Max/Mean) using `_calculate_image_stats` on normalized float64 data for the `CALCULATE_STATS_RESOLUTION`.
|
||||
* Determines the aspect ratio change string (e.g., `"EVEN"`, `"X150"`) using `_normalize_aspect_ratio_change`.
|
||||
* Stores details about each processed map (path, resolution, format, stats, etc.) in `processed_maps_details_asset`.
|
||||
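Two of the numeric operations above, sketched with NumPy in place of the real cv2-based helpers (the Gloss->Roughness inversion and the per-map statistics); function names here are illustrative, not the real method names:

```python
import numpy as np

def gloss_to_roughness(gloss: np.ndarray) -> np.ndarray:
    """Invert an integer-typed gloss map into roughness on a 0-1 scale,
    i.e. the `1.0 - img/norm` step described above."""
    norm = np.iinfo(gloss.dtype).max          # 255 for uint8, 65535 for uint16
    return 1.0 - gloss.astype(np.float64) / norm

def image_stats(img: np.ndarray) -> dict:
    """Min/Max/Mean on float64 data, mirroring the stats calculation."""
    data = img.astype(np.float64)
    return {"min": float(data.min()), "max": float(data.max()),
            "mean": float(data.mean())}
```

A fully glossy (255) uint8 map inverts to roughness 0.0, as expected.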
7. **Map Merging (`_merge_maps_from_source`)**:
    * Iterates through the `MAP_MERGE_RULES` defined in the `Configuration`.
    * Identifies the required *source* map files needed as input for each merge rule based on the classified files.
    * Determines the common resolutions available across the required input maps.
    * Loads the necessary source map channels for each common resolution (using a helper, `_load_and_transform_source`, which includes caching).
    * Converts inputs to normalized float32 (0-1).
    * Injects default channel values (from the rule's `defaults`) if an input channel is missing.
    * Merges channels using `cv2.merge`.
    * Determines output bit depth and format based on rules (similar logic to `_process_maps`, considering input properties). Handles the potential JPG/16-bit conflict by forcing 8-bit.
    * Saves the merged map using the `_save_image` helper (includes data type/color space conversions and fallback).
    * Stores details about each merged map in `merged_maps_details_asset`.
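The defaults-injection and merge steps can be sketched like this, with NumPy's `stack` standing in for `cv2.merge` and illustrative names throughout:

```python
import numpy as np

def merge_channels(channels, defaults, shape):
    """channels: dict like {'R': 2-D array or None, ...};
    defaults: 0-1 floats injected for any missing channel."""
    planes = []
    for name in ("R", "G", "B"):
        plane = channels.get(name)
        if plane is None:
            # Missing input channel: fill with the rule's default value.
            plane = np.full(shape, defaults[name], dtype=np.float32)
        planes.append(plane.astype(np.float32))
    return np.stack(planes, axis=-1)           # H x W x 3 merged map
```

With only an `R` input supplied, the `G` and `B` planes come entirely from the defaults, matching the behavior described for the merge rules.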
8. **Metadata File Generation (`_generate_metadata_file`)**:
    * Collects all determined information for the current asset: base metadata, details from `processed_maps_details_asset` and `merged_maps_details_asset`, the list of ignored files, the source preset used, etc.
    * Writes this collected data into the `metadata.json` file within the temporary workspace using `json.dump`.
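The shape of that payload might look roughly like the following; the actual key names and values are assembled by `asset_processor.py`, so treat these as placeholders:

```python
import json
import os
import tempfile

# Placeholder payload: real keys/values come from the pipeline's
# intermediate results in asset_processor.py.
metadata = {
    "asset_name": "ExampleWood01",
    "category": "Texture",
    "archetype": "Wood",
    "source_preset": "ExampleSupplier",
    "processed_maps": [{"type": "COL", "resolution": "2K", "format": ".jpg"}],
    "merged_maps": [],
    "ignored_files": [],
}

workspace = tempfile.mkdtemp()                 # stands in for the temp workspace
path = os.path.join(workspace, "metadata.json")
with open(path, "w", encoding="utf-8") as fh:
    json.dump(metadata, fh, indent=2)
```

The file is written into the temporary workspace first and only moved to the final library location in step 9.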
9. **Output Organization (`_organize_output_files`)**:
    * Creates the final structured output directory: `<output_base_dir>/<supplier_name>/<asset_name>/`.
    * Creates subdirectories `Extra/`, `Unrecognised/`, and `Ignored/` within the asset directory.
    * Moves the processed maps, merged maps, model files, `metadata.json`, and files classified as Extra, Unrecognised, or Ignored from the temporary workspace into their respective locations in the final output directory structure.

10. **Workspace Cleanup (`_cleanup_workspace`)**:
    * Removes the temporary workspace directory and its contents using `shutil.rmtree()`. This is called within a `finally` block to ensure cleanup is attempted even if errors occur during processing.

11. **(Optional) Blender Script Execution**:
    * If triggered via CLI arguments (`--nodegroup-blend`, `--materials-blend`) or GUI controls, the orchestrator (`main.py` or `gui/processing_handler.py`) executes the corresponding Blender scripts (`blenderscripts/*.py`) using `subprocess.run` after the `AssetProcessor.process()` call completes successfully for an asset batch. See `Developer Guide: Blender Integration Internals` for more details.

**Note on Data Passing:** As mentioned in the Architecture documentation, major changes to the data passing mechanisms between the GUI, Main (CLI orchestration), and `AssetProcessor` modules are currently being planned. The descriptions of how data is processed and transformed within this pipeline reflect the current state and will require review and updates once the plan for these changes is finalized.
Documentation/02_Developer_Guide/06_GUI_Internals.md (new file, 105 lines)
@@ -0,0 +1,105 @@
# Developer Guide: GUI Internals

This document provides technical details about the implementation of the Graphical User Interface (GUI) for developers.

## Framework

The GUI is built using `PySide6`, which provides Python bindings for the Qt framework.

## Main Window (`gui/main_window.py`)

The `MainWindow` class is the central component of the GUI application. It is responsible for:

* Defining the main application window structure and layout using PySide6 widgets.
* Arranging the Preset Editor panel (left) and the Processing panel (right).
* Setting up the menu bar, including the "View" menu for toggling the Log Console and Detailed File Preview.
* Connecting user interactions (button clicks, drag-and-drop events, checkbox states, spinbox values) to corresponding methods (slots) within the `MainWindow` or other handler classes.
* Managing the display of application logs in the UI console using a custom `QtLogHandler`.
* Interacting with background handlers (`ProcessingHandler`, `PredictionHandler`) via Qt signals and slots to ensure thread-safe updates to the UI during long-running operations.
|
||||
## Threading and Background Tasks
|
||||
|
||||
To keep the UI responsive during intensive operations like asset processing and file preview generation, the GUI utilizes background threads managed by `QThread`.
|
||||
|
||||
* **`ProcessingHandler` (`gui/processing_handler.py`):** This class is designed to run in a separate `QThread`. It manages the execution of the main asset processing pipeline for multiple assets concurrently using `concurrent.futures.ProcessPoolExecutor`. It submits individual asset processing tasks to the pool and monitors their completion. It uses Qt signals to communicate progress updates, file status changes, and overall processing completion back to the `MainWindow` on the main UI thread. It also handles the execution of optional Blender scripts via subprocess calls after processing. This handler processes and utilizes data structures received from the core processing engine, such as status summaries.
|
||||
* **`PredictionHandler` (`gui/prediction_handler.py`):** This class also runs in a separate `QThread`. It is responsible for generating the detailed file classification previews displayed in the preview table. It calls methods on the `AssetProcessor` (`get_detailed_file_predictions`) to perform the analysis in the background. It uses a `ThreadPoolExecutor` for potentially concurrent prediction tasks. Results are sent back to the `MainWindow` via Qt signals to update the preview table data. This handler works with data structures containing file prediction details.
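The submit-and-report pattern shared by both handlers can be sketched without Qt. In this simplification, `report` stands in for a Qt signal emission and `predict_file` for the real `AssetProcessor` call; all names here are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def predict_file(path):
    # Stand-in for a per-file call into AssetProcessor's prediction logic.
    return {"path": path, "status": "Mapped"}

def run_predictions(paths, report):
    """Submit one task per file and forward each result as it completes.
    In the real PredictionHandler, `report` would be a Qt signal emit, so
    the MainWindow receives results safely on the UI thread."""
    results = []
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = {pool.submit(predict_file, p): p for p in paths}
        for fut in as_completed(futures):
            result = fut.result()
            report(result)  # real code: self.prediction_results_ready.emit(...)
            results.append(result)
    return results
```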

## Communication (Signals and Slots)

Communication between the main UI thread (`MainWindow`) and the background threads (`ProcessingHandler`, `PredictionHandler`) relies heavily on Qt's signals and slots mechanism, which provides a thread-safe way for objects in different threads to communicate.

* Background handlers emit signals to indicate events (e.g., progress updated, file status changed, task finished).
* The `MainWindow` connects slots (methods) to these signals. When a signal is emitted, the connected slot is invoked on the thread that owns the receiving object (the main UI thread for `MainWindow`), ensuring UI updates happen safely.

## Preset Editor

The GUI includes an integrated preset editor panel, which allows users to interactively create, load, modify, and save preset `.json` files directly within the application. The editor uses standard UI widgets to display and edit the key fields of the preset structure.

## Preview Table

The `PreviewTableModel` receives a list of file prediction dictionaries from the `PredictionHandler` via the `prediction_results_ready` signal. Each dictionary describes one file: its original path, predicted asset name, status, and other relevant information.

Instead of displaying this flat list directly, the model transforms it into a structured list of rows (`self._table_rows`). This transformation involves:

1. **Grouping:** Files are grouped based on their `source_asset`.
2. **Separation:** Within each asset group, files are separated into `main_files` (Mapped, Model, Error) and `additional_files` (Ignored, Extra, Unrecognised, Unmatched Extra).
3. **Structuring Rows:** Rows are created for `self._table_rows` to represent the grouped data. Each row can contain information about a main file and/or an additional file, allowing for the display of additional files in a separate column aligned with the main files of the same asset. Empty rows are created if there are more additional files than main files for an asset to maintain alignment.
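A minimal sketch of the three steps, using the status labels and field names from the description (the real model's row structure differs in detail):

```python
from collections import defaultdict
from itertools import zip_longest

MAIN_STATUSES = {"Mapped", "Model", "Error"}  # labels from the description

def build_table_rows(predictions):
    """Group flat predictions by source asset, then pair main files with
    additional files column-wise, padding with None to keep alignment."""
    groups = defaultdict(lambda: {"main": [], "extra": []})
    for pred in predictions:
        bucket = "main" if pred["status"] in MAIN_STATUSES else "extra"
        groups[pred["source_asset"]][bucket].append(pred)
    rows = []
    for asset in sorted(groups):
        group = groups[asset]
        # zip_longest pads the shorter column with None, producing the
        # "empty row" behavior described above.
        for main, extra in zip_longest(group["main"], group["extra"]):
            rows.append({"asset": asset, "main": main, "extra": extra})
    return rows

rows = build_table_rows([
    {"source_asset": "A", "status": "Mapped", "path": "a_col.png"},
    {"source_asset": "A", "status": "Ignored", "path": "a_readme.txt"},
    {"source_asset": "A", "status": "Extra", "path": "a_ref.jpg"},
    {"source_asset": "B", "status": "Model", "path": "b.fbx"},
])
```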

The `data()` method of the `PreviewTableModel` then reads this structured `self._table_rows` list to provide data to the `QTableView`. It handles the different columns and roles (Display, Tooltip, Foreground, and Background):

* `Qt.ItemDataRole.ForegroundRole`: Sets the text color of individual cells based on the status of the file they represent. Cells for a main file are colored by the main file's status; cells in the "Additional Files" column are colored by the additional file's status.
* `Qt.ItemDataRole.BackgroundRole`: Provides alternating background colors based on the index of the row's asset group in a sorted list of unique assets, improving visual separation between asset groups.

The `PreviewSortFilterProxyModel` operates on this structured data, implementing a multi-level sort based on source asset, row type (main vs. additional-only), and file paths within those types.

### Preview Table Column Configuration

The display and behavior of the columns in the `QTableView` are configured in `gui/main_window.py`. The current configuration is as follows:

* **Column Order (from left to right):**
    1. Status
    2. Predicted Asset
    3. Details
    4. Original Path
    5. Additional Files
* **Column Resizing:**
    * Status: Resizes to content.
    * Predicted Asset: Resizes to content.
    * Details: Resizes to content.
    * Original Path: Resizes to content (fixed-width behavior).
    * Additional Files: Stretches to fill available space.

**Data Flow Diagram:**

```mermaid
graph LR
    A[PredictionHandler] -- "prediction_results_ready(flat_list)" --> B(PreviewTableModel)
    subgraph PreviewTableModel
        C["set_data()"] -- "processes flat list" --> D{"Internal Grouping & Transformation"}
        D -- creates --> E["_table_rows (structured list)"]
        F["data()"] -- "reads from" --> E
    end
    B -- "provides data via data()" --> G(QTableView via Proxy)

    style B fill:#f9f,stroke:#333,stroke-width:2px
    style C fill:#ccf,stroke:#333,stroke-width:1px
    style D fill:#ddd,stroke:#333,stroke-width:1px
    style E fill:#ccf,stroke:#333,stroke-width:1px
    style F fill:#ccf,stroke:#333,stroke-width:1px
```

### Application Styling

The application style is explicitly set to 'Fusion' in `gui/main_window.py` to provide a consistent look and feel across operating systems, particularly to address styling inconsistencies observed on Windows 11. A custom `QPalette` is also applied to adjust default colors within the 'Fusion' style, specifically changing the background color of list-like widgets (and potentially other elements) from a default dark blue to a more neutral grey.

## Logging

A custom `QtLogHandler` is used to redirect log messages from the standard Python `logging` module to a text area or console widget within the GUI, allowing users to see detailed application output and errors.
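The idea can be sketched without Qt as a plain `logging.Handler` that forwards formatted records to a callable; the actual `QtLogHandler` emits a Qt signal instead, so the text widget is appended to on the UI thread:

```python
import logging

class CallbackLogHandler(logging.Handler):
    """Qt-free analogue of the QtLogHandler idea: forward each formatted
    log record to a callable (the GUI version emits a Qt signal here)."""
    def __init__(self, callback):
        super().__init__()
        self.callback = callback

    def emit(self, record):
        self.callback(self.format(record))

lines = []
log = logging.getLogger("demo")
log.setLevel(logging.INFO)
handler = CallbackLogHandler(lines.append)
handler.setFormatter(logging.Formatter("%(levelname)s: %(message)s"))
log.addHandler(handler)
log.info("processing started")
```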

## Cancellation

The GUI provides a "Cancel" button to stop ongoing processing. The `ProcessingHandler` implements logic to handle cancellation requests. This typically involves setting an internal flag and attempting to shut down the `ProcessPoolExecutor`. However, it's important to note that this does not immediately terminate worker processes that are already executing; it primarily prevents new tasks from starting and stops processing results from completed futures once the cancellation flag is checked.
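A reduced sketch of that cooperative-cancellation pattern (using `ThreadPoolExecutor` here for brevity; the real handler manages a `ProcessPoolExecutor` and reports via Qt signals):

```python
import threading
from concurrent.futures import ThreadPoolExecutor

class CancellableRunner:
    """Illustrative flag-plus-shutdown cancellation: already-running tasks
    are not interrupted; pending ones are dropped and results are ignored."""
    def __init__(self):
        self._cancel_requested = threading.Event()

    def cancel(self):
        self._cancel_requested.set()

    def run(self, tasks, worker):
        results = []
        with ThreadPoolExecutor(max_workers=2) as pool:
            futures = [pool.submit(worker, t) for t in tasks]
            for fut in futures:
                if self._cancel_requested.is_set():
                    fut.cancel()  # only succeeds for tasks not yet started
                    continue
                results.append(fut.result())
        return results
```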

**Note on Data Passing:** As mentioned in the Architecture documentation, major changes to the data passing mechanisms between the GUI, Main (CLI orchestration), and `AssetProcessor` modules are currently being planned. The descriptions of how data is handled and passed within the GUI and its interactions with background handlers reflect the current state and will require review and updates once the plan for these changes is finalized.

33
Documentation/02_Developer_Guide/07_Monitor_Internals.md
Normal file
@@ -0,0 +1,33 @@

# Developer Guide: Monitor Internals

This document provides technical details about the implementation of the Directory Monitor script (`monitor.py`) for developers.

## Overview

The `monitor.py` script provides an automated way to process assets by monitoring a specified input directory for new ZIP files. It is built using the `watchdog` library.

## Key Components

* **`watchdog` Library:** The script relies on the `watchdog` library for monitoring file system events. Specifically, it uses a `PollingObserver` to watch the `INPUT_DIR` for changes.
* **`ZipHandler` Class:** A custom event handler class defined within `monitor.py`. It inherits from a `watchdog` event handler base class (the standard pattern is `FileSystemEventHandler`, though the source text does not state this explicitly). Its primary method of interest is `on_created`, which handles file creation events.
* **`main.run_processing`:** The monitor script triggers the main asset processing logic by calling the `run_processing` function from the `main.py` module.

## Functionality Details

1. **Watching:** A `PollingObserver` is set up to monitor the directory specified by the `INPUT_DIR` environment variable. Polling checks for changes at a frequency defined by `POLL_INTERVAL`.
2. **Event Handling:** The `ZipHandler` is attached to the observer. When a file is created in the monitored directory, the handler's `on_created` method is triggered.
3. **ZIP File Detection:** The `on_created` method checks whether the newly created file is a `.zip` file.
4. **Filename Parsing:** If it is a ZIP file, the script expects the filename to follow a specific format: `[preset]_filename.zip`. It uses a regular expression (`PRESET_FILENAME_REGEX`, likely defined in `config.py` or similar) to extract the `[preset]` part from the filename.
5. **Preset Validation:** It validates whether the extracted `preset` name corresponds to an existing preset JSON file in the `Presets/` directory.
6. **Triggering Processing:** If the preset is valid, `monitor.py` calls `main.run_processing`, passing the path to the detected ZIP file and the extracted preset name. This initiates the main asset processing pipeline for that single asset. A `PROCESS_DELAY` can be configured to wait before triggering processing, allowing large files to finish copying.
7. **Source ZIP Management:** After the processing initiated by `main.run_processing` completes, the original source `.zip` file is moved to either the `PROCESSED_DIR` (if processing was successful or skipped) or the `ERROR_DIR` (if processing failed or the preset was invalid).
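Steps 3-5 above can be sketched as follows. The regex assumes the bracketed `[preset]_` prefix is literal; the authoritative pattern is the real `PRESET_FILENAME_REGEX`, so treat this as an illustration only:

```python
import re
from pathlib import Path

# Assumed shape of the "[preset]_filename.zip" convention described above.
PRESET_FILENAME_REGEX = re.compile(r"^\[(?P<preset>[^\]]+)\]_.+\.zip$", re.IGNORECASE)

def extract_valid_preset(zip_path, available_presets):
    """Return the preset name embedded in the filename, or None if the
    filename does not match or the preset has no JSON file in Presets/."""
    match = PRESET_FILENAME_REGEX.match(Path(zip_path).name)
    if not match:
        return None
    preset = match.group("preset")
    return preset if preset in available_presets else None
```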

## Configuration

The monitor's behavior is primarily controlled by environment variables, which are read by the `monitor.py` script. These include `INPUT_DIR`, `OUTPUT_DIR`, `PROCESSED_DIR`, `ERROR_DIR`, `LOG_LEVEL`, `POLL_INTERVAL`, and `NUM_WORKERS`.
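A sketch of reading those variables with fallbacks (the default values shown here are assumptions for illustration, not the tool's actual defaults):

```python
import os

def load_monitor_settings(env=None):
    """Read the monitor's settings from environment variables.
    Variable names come from the documentation; defaults are assumed."""
    env = os.environ if env is None else env
    return {
        "INPUT_DIR": env.get("INPUT_DIR", "/watch/input"),
        "OUTPUT_DIR": env.get("OUTPUT_DIR", "/watch/output"),
        "PROCESSED_DIR": env.get("PROCESSED_DIR", "/watch/processed"),
        "ERROR_DIR": env.get("ERROR_DIR", "/watch/error"),
        "LOG_LEVEL": env.get("LOG_LEVEL", "INFO"),
        "POLL_INTERVAL": float(env.get("POLL_INTERVAL", "5.0")),
        "NUM_WORKERS": int(env.get("NUM_WORKERS", "4")),
    }
```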

## Limitations

* The current implementation of the directory monitor does *not* support triggering the optional Blender script execution after processing. This post-processing step is only available when running the tool via the CLI or GUI.

Understanding the interaction between `watchdog`, the `ZipHandler`, and the call to `main.run_processing` is key to debugging or modifying the directory monitoring functionality.

55
Documentation/02_Developer_Guide/08_Blender_Integration_Internals.md
Normal file
@@ -0,0 +1,55 @@

# Developer Guide: Blender Integration Internals

This document provides technical details about how the Asset Processor Tool integrates with Blender for automated post-processing tasks.

## Overview

The tool can optionally execute Python scripts within a Blender instance after successfully processing a batch of assets. This is primarily used to automate the creation of PBR node groups and materials in specified `.blend` files, linking to the newly processed textures and their associated `metadata.json` files.

## Execution Mechanism

The Blender scripts are executed by the main orchestrator (`main.py` for CLI, `gui/processing_handler.py` for GUI) using Python's `subprocess.run`.

* The command executed is typically: `blender -b <target_blend_file> --python <script_path> -- <arguments_for_script>`
    * `blender`: The path to the Blender executable (configured in `config.py` or found in the system PATH).
    * `-b`: Runs Blender in background mode (headless, no GUI).
    * `<target_blend_file>`: The `.blend` file that the script will operate on.
    * `--python <script_path>`: Specifies the Python script to run within Blender. The scripts are located in the `blenderscripts/` directory (`create_nodegroups.py`, `create_materials.py`).
    * `--`: This separator is crucial: arguments placed after `--` are passed to the Python script's `sys.argv` rather than consumed by Blender itself.
    * `<arguments_for_script>`: The information the Asset Processor Tool passes to the Blender scripts via `sys.argv`. Currently this includes the path to the processed asset's root directory (containing `metadata.json`) and, for the materials script, the path to the `.blend` file containing the node groups.
* The GUI handler (`gui/processing_handler.py`) also passes `--factory-startup`, which starts Blender with factory defaults, avoiding potential issues with user preferences or add-ons.
* The orchestrator checks the return code of the `subprocess.run` call. A non-zero return code indicates an error during script execution within Blender. Stdout and stderr from the Blender process are captured and logged by the Asset Processor Tool.
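Inside a script launched this way, the forwarded arguments can be recovered by splitting `sys.argv` on the separator, e.g.:

```python
import sys

def args_after_separator(argv=None):
    """Return the arguments Blender forwards after the '--' separator.
    Inside Blender, sys.argv starts with Blender's own options; everything
    after '--' belongs to the --python script."""
    argv = sys.argv if argv is None else argv
    if "--" in argv:
        return argv[argv.index("--") + 1:]
    return []
```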

## Blender Scripts (`blenderscripts/`)

The `blenderscripts/` directory contains Python scripts designed to be run *inside* a Blender environment. They import Blender's `bpy` module to interact with the `.blend` file data.

* **`create_nodegroups.py`**:
    * **Purpose:** Creates or updates PBR node groups in a target `.blend` file based on processed assets.
    * **Execution:** Typically triggered by the Asset Processor after asset processing. Can also be run manually within Blender's Text Editor.
    * **Input:** Reads the `metadata.json` file from the processed asset's directory (path received via `sys.argv`).
    * **Functionality:** Accesses `bpy.data.node_groups` to create or modify node groups. Loads texture images using `bpy.data.images.load`. Sets up nodes and links within the node group. Applies metadata settings (aspect ratio, stats, resolution) to the node group interface or properties. Sets preview images for the node groups. Saves the target `.blend` file.
    * **Prerequisites (for manual run):** Processed asset library available, and a target `.blend` file containing the template node groups (`Template_PBRSET`, `Template_PBRTYPE`).
    * **Configuration (for manual run):** Requires setting `PROCESSED_ASSET_LIBRARY_ROOT` internally (overridden by the Asset Processor when triggered).

* **`create_materials.py`**:
    * **Purpose:** Creates or updates materials in a target `.blend` file that link to the PBR node groups created by `create_nodegroups.py`.
    * **Execution:** Typically triggered by the Asset Processor after `create_nodegroups.py` has run. Can also be run manually within Blender's Text Editor.
    * **Input:** Reads the `metadata.json` file (path via `sys.argv`) and receives the path to the `.blend` file containing the node groups (also via `sys.argv`).
    * **Functionality:** Accesses `bpy.data.materials` to create or modify materials. Copies a template material (`Template_PBRMaterial`). Links the corresponding PBRSET node group from the specified library `.blend` file using `bpy.data.libraries.load`. Replaces a placeholder node (`PLACEHOLDER_NODE_LABEL`) in the template material's node tree with the linked node group. Marks the material as an asset, copies tags, sets preview images and viewport properties, and saves the target `.blend` file.
    * **Prerequisites (for manual run):** Processed asset library available, the `.blend` file containing the PBRSET node groups, and the *current* target `.blend` file must contain a template material named `Template_PBRMaterial` with a Group node labeled `PLACEHOLDER_NODE_LABEL`.
    * **Configuration (for manual run):** Requires setting `PROCESSED_ASSET_LIBRARY_ROOT` and `NODEGROUP_BLEND_FILE_PATH` internally (overridden by the Asset Processor when triggered). Constants like `TEMPLATE_MATERIAL_NAME` and `PLACEHOLDER_NODE_LABEL` can be adjusted.

## Data Flow

Information is passed from the Asset Processor to the Blender scripts primarily through the `metadata.json` file and command-line arguments (`sys.argv`). The scripts read `metadata.json` to get details about the processed maps, stats, etc., and use the provided paths to locate the necessary files and target `.blend` files.

## Limitations

* The Directory Monitor (`monitor.py`) does not currently support triggering the Blender integration scripts.
* Cancellation of the main Asset Processor process does not necessarily stop a Blender subprocess that has already been launched.
* The Blender scripts rely on specific naming conventions for template node groups and materials within the target `.blend` files.

Understanding the subprocess execution, the data passed via `sys.argv` and `metadata.json`, and the `bpy` API usage within the scripts is essential for developing or debugging the Blender integration.

50
Documentation/02_Developer_Guide/09_Development_Workflow.md
Normal file
@@ -0,0 +1,50 @@

# Developer Guide: Development Workflow

This document provides guidance for developers on the typical workflow for contributing to or modifying the Asset Processor Tool.

## Modifying Core Processing Logic

Changes to how assets are classified, maps are processed (resizing, format conversion, bit depth handling), channels are merged, or metadata is generated primarily involve editing the `AssetProcessor` class in `asset_processor.py`. Understanding the processing pipeline steps outlined in `05_Processing_Pipeline.md` is crucial here.

## Changing Global Settings/Rules

Adjustments to default output paths, standard image resolutions, default format rules, map merge definitions, Blender paths, or other global constants should be made in `config.py`.

## Adding/Modifying Supplier Rules (Presets)

To add support for a new asset source, or to change how an existing one is interpreted, create or edit the corresponding JSON file in the `Presets/` directory.

* Use `Presets/_template.json` as a base for new presets.
* Focus on defining accurate regex patterns and rules in fields like `map_type_mapping`, `bit_depth_variants`, `model_patterns`, `source_naming_convention`, etc.
* Refer to `04_Configuration_System_and_Presets.md` for a detailed explanation of the preset file structure and the configuration loading process.

## Adjusting CLI Behavior

Changes to command-line arguments, argument parsing logic, or the overall CLI workflow are handled in `main.py`. This includes how arguments are parsed using `argparse`, how parallel processing is orchestrated, and how Blender scripts are triggered from the CLI.

## Modifying the GUI

Work on the Graphical User Interface involves the files within the `gui/` directory.

* UI layout changes, adding new controls, or altering event handling are typically done in `main_window.py`.
* Modifications to how background processing tasks are managed for the GUI are handled in `processing_handler.py`.
* Changes to how file classification previews are generated and updated in the UI are in `prediction_handler.py`.
* Understanding Qt's signals and slots mechanism and the use of `QThread` and `ProcessPoolExecutor` (as detailed in `06_GUI_Internals.md`) is essential for GUI development.

## Enhancing Blender Integration

Improvements or changes to how node groups or materials are created in Blender require editing the Python scripts within the `blenderscripts/` directory (`create_nodegroups.py`, `create_materials.py`).

* These scripts are designed to be executed *within* Blender and interact with Blender's `bpy` API.
* Consider how these scripts are invoked by the Asset Processor (via subprocess calls) and what data they expect (primarily from `metadata.json` and `sys.argv`).
* Refer to `08_Blender_Integration_Internals.md` for details on the execution mechanism and script specifics.

## General Development Practices

* Adhere to the project's coding conventions (see `10_Coding_Conventions.md`).
* Utilize the standard Python `logging` module for outputting information and debugging messages.
* Use `try...except` blocks for error handling, and leverage the custom exceptions (`ConfigurationError`, `AssetProcessingError`) where appropriate.
* When working with file paths, use `pathlib.Path` for consistency and robustness.
* Be mindful of concurrency when working with the GUI or parallel processing in the CLI.

This workflow provides a general guide; specific tasks may require delving into multiple files and understanding the interactions between different components.

34
Documentation/02_Developer_Guide/10_Coding_Conventions.md
Normal file
@@ -0,0 +1,34 @@

# Developer Guide: Coding Conventions

This document outlines the coding conventions and general practices followed within the Asset Processor Tool codebase. Adhering to these conventions helps maintain consistency and readability.

## General Principles

* **Readability:** Code should be easy to read and understand. Use clear variable and function names.
* **Consistency:** Follow existing patterns and styles within the codebase.
* **Maintainability:** Write code that is easy to modify and extend.
* **Explicitness:** Be explicit rather than implicit.

## Specific Conventions

* **Object-Oriented Programming (OOP):** The codebase heavily utilizes classes to structure the application logic (e.g., `AssetProcessor`, `Configuration`, `MainWindow`, the various handlers). Follow standard OOP principles.
* **Type Hinting:** Use Python type hints throughout the code to indicate the expected types of function arguments, return values, and variables. This improves code clarity and allows for static analysis.
* **Logging:** Use the standard Python `logging` module for all output messages (information, warnings, errors, debug). Avoid using `print()` for application output. Configure log levels appropriately. The GUI uses a custom `QtLogHandler` to integrate logging with the UI.
* **Error Handling:** Use standard `try...except` blocks to handle potential errors gracefully. Define and use custom exceptions (e.g., `ConfigurationError`, `AssetProcessingError`) for specific error conditions within the application logic. Log exceptions with `exc_info=True` to include traceback information.
* **Parallelism:** When implementing CPU-bound tasks that can run concurrently, use `concurrent.futures.ProcessPoolExecutor` as demonstrated in `main.py` and `gui/processing_handler.py`. Ensure that shared state is handled correctly (e.g., by instantiating necessary objects within worker processes).
* **GUI Development (`PySide6`):**
* Use Qt's signals and slots mechanism for communication between objects, especially across threads.
* Run long-running or blocking tasks in separate `QThread`s to keep the main UI thread responsive.
* Perform UI updates only from the main UI thread.
* **Configuration:** Core settings are managed in `config.py` (a Python module). Supplier-specific rules are managed in JSON files (`Presets/`). The `Configuration` class handles loading and merging these.
* **File Paths:** Use `pathlib.Path` objects for handling file system paths. Avoid string manipulation for path joining or parsing.
* **Docstrings:** Write clear and concise docstrings for modules, classes, methods, and functions, explaining their purpose, arguments, and return values.
* **Comments:** Use comments to explain complex logic or non-obvious parts of the code.
* **Imports:** Organize imports at the top of the file, grouped by standard library, third-party libraries, and local modules.
* **Naming:**
* Use `snake_case` for function and variable names.
* Use `PascalCase` for class names.
* Use `UPPER_CASE` for constants.
* Use a leading underscore (`_`) for internal or "protected" methods/attributes.
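The error-handling and logging conventions above can be combined in a small sketch (the exception name comes from the codebase; the function and its behavior are illustrative):

```python
import logging

log = logging.getLogger(__name__)

class AssetProcessingError(Exception):
    """Per-asset failure (exception name from the codebase; body assumed)."""

def process_one_asset(name):
    try:
        if not name:
            raise AssetProcessingError("empty asset name")
        return {"asset": name, "status": "processed"}
    except AssetProcessingError:
        # Traceback is preserved in the log, per the exc_info=True convention.
        log.error("Asset processing failed", exc_info=True)
        return {"asset": name, "status": "failed"}
```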

Adhering to these conventions will make the codebase more consistent, easier to understand, and more maintainable for all contributors.

147
Documentation/02_Developer_Guide/11_Debugging_Notes.md
Normal file
@@ -0,0 +1,147 @@

# Developer Guide: Debugging Notes

This document provides deeper technical details about the internal workings of the Asset Processor Tool, intended to aid in debugging unexpected behavior.

## Internal Logic & Algorithms

* **Configuration Preparation (`Configuration` class in `configuration.py`):**
* Instantiated per preset (`__init__`).
* Loads core settings from `config.py` using `importlib.util`.
* Loads the specified preset from `presets/{preset_name}.json`.
* Validates the basic structure of the loaded settings (`_validate_configs`), checking for required keys and basic types (e.g., that `map_type_mapping` is a list of dicts).
* Compiles regex patterns (`_compile_regex_patterns`) from preset rules (extra, model, bit depth, map keywords) using `re.compile` (mostly case-insensitive) and stores them on the instance (e.g., `self.compiled_map_keyword_regex`). Uses `_fnmatch_to_regex` helper for basic wildcard conversion.
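A helper in the spirit of `_fnmatch_to_regex` can be sketched with the standard library (the real helper's exact translation rules may differ):

```python
import fnmatch
import re

def fnmatch_to_regex(pattern):
    """Convert a shell-style wildcard to a compiled, case-insensitive regex,
    analogous in intent to the _fnmatch_to_regex helper described above."""
    return re.compile(fnmatch.translate(pattern), re.IGNORECASE)

diffuse_rx = fnmatch_to_regex("*_diffuse.*")
```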

* **CLI Argument Parsing (`main.py:setup_arg_parser`):**
* Uses `argparse` to define and parse command-line arguments.
* Key arguments influencing flow: `--preset` (required), `--output-dir` (optional override), `--workers` (concurrency), `--overwrite` (force reprocessing), `--verbose` (logging level), `--nodegroup-blend`, `--materials-blend`.
* Calculates a default worker count based on `os.cpu_count()`.
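A minimal sketch of that parser surface (argument names from the text; the flag types, defaults, and worker formula are assumptions):

```python
import argparse
import os

def setup_arg_parser():
    """Illustrative reconstruction of the CLI described above."""
    parser = argparse.ArgumentParser(prog="asset_processor")
    parser.add_argument("--preset", required=True)
    parser.add_argument("--output-dir")
    # Assumed default: leave one core free, never below 1.
    parser.add_argument("--workers", type=int,
                        default=max(1, (os.cpu_count() or 2) - 1))
    parser.add_argument("--overwrite", action="store_true")
    parser.add_argument("--verbose", action="store_true")
    parser.add_argument("--nodegroup-blend")
    parser.add_argument("--materials-blend")
    return parser
```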

* **Output Directory Resolution (`main.py:main`):**
* Determines the base output directory by checking the `--output-dir` argument first, then falling back to `OUTPUT_BASE_DIR` from `config.py`.
* Resolves the path to an absolute path and ensures the directory exists (`Path.resolve()`, `Path.mkdir(parents=True, exist_ok=True)`).
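The resolution order can be sketched as follows (the config default shown is a placeholder for `OUTPUT_BASE_DIR`):

```python
from pathlib import Path

def resolve_output_dir(cli_value, config_default="/library/out", create=False):
    """--output-dir wins over the config.py default; the result is made
    absolute, and the directory is optionally created."""
    base = (Path(cli_value) if cli_value else Path(config_default)).resolve()
    if create:
        base.mkdir(parents=True, exist_ok=True)
    return base
```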

* **Asset Processing (`AssetProcessor` class in `asset_processor.py`):**
* **Classification (`_inventory_and_classify_files`):**
* Multi-pass approach: Explicit Extra (regex) -> Models (regex) -> Potential Maps (keyword regex) -> Standalone 16-bit check (regex) -> Prioritize 16-bit variants -> Final Maps -> Remaining as Unrecognised (Extra).
* Uses the compiled regex patterns provided by the `Configuration` object passed during initialization.
* Sorts potential map variants based on: 1. preset rule index, 2. keyword index within the rule, 3. alphabetical path. Suffixes (`-1`, `-2`) are assigned later per asset based on this sort order and `RESPECT_VARIANT_MAP_TYPES`.
* **Map Processing (`_process_maps`):**
* Loads images using `cv2.imread` (flags: `IMREAD_UNCHANGED` or `IMREAD_GRAYSCALE`). Converts loaded 3-channel images from BGR to RGB for internal consistency (stats, merging).
* **Saving Channel Order:** Before saving with `cv2.imwrite`, 3-channel images are converted back from RGB to BGR *only* if the target output format is *not* EXR (e.g., for PNG, JPG, TIF). This ensures correct channel order for standard formats while preserving RGB for EXR. (Fix for ISSUE-010.)
* Handles Gloss -> Roughness inversion: loads the gloss map, inverts using float math (`1.0 - img/norm`), and stores the result as float32 along with the original dtype. Prioritizes the gloss source if both gloss and native roughness exist.
* Resizes using `cv2.resize` (interpolation: `INTER_LANCZOS4` for downscaling, `INTER_CUBIC` for same-size/upscaling, though upscaling is generally avoided by earlier checks).
* Determines the output format by hierarchy: `FORCE_LOSSLESS_MAP_TYPES` > `RESOLUTION_THRESHOLD_FOR_JPG` > input format priority (TIF/EXR often lead to lossless) > configured defaults (`OUTPUT_FORMAT_16BIT_PRIMARY`, `OUTPUT_FORMAT_8BIT`).
* Determines the output bit depth based on `MAP_BIT_DEPTH_RULES` ('respect' vs 'force_8bit').
* Converts dtype before saving (e.g., float to uint8/uint16 using the scaling factors 255.0/65535.0).
* Calculates stats (`_calculate_image_stats`) on normalized float64 data (in RGB space) at a specific resolution (`CALCULATE_STATS_RESOLUTION`).
* Calculates the aspect-ratio string (`_normalize_aspect_ratio_change`) based on relative dimension changes.
* Handles save fallback: If primary 16-bit format (e.g., EXR) fails, attempts fallback (e.g., PNG).
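The gloss inversion and pre-save dtype conversion described above can be sketched on plain Python lists (the real code operates on numpy arrays):

```python
def invert_gloss(values, norm):
    """Gloss -> roughness inversion in float math (1.0 - value/norm)."""
    return [1.0 - v / norm for v in values]

def float_to_uint(values, bits):
    """Convert normalized [0, 1] floats to 8- or 16-bit samples using the
    255.0 / 65535.0 scaling factors mentioned above, clamping first."""
    scale = 255.0 if bits == 8 else 65535.0
    return [round(min(max(v, 0.0), 1.0) * scale) for v in values]
```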
|
||||
* **Merging (`_merge_maps_from_source`):**
|
||||
* Identifies the required *source* files for merge inputs based on classified files.
|
||||
* Determines common resolutions based on available processed maps (as a proxy for size compatibility).
|
||||
* Loads required source maps for each common resolution using the `_load_and_transform_source` helper (utilizing the cache).
|
||||
* Converts loaded inputs to float32 (normalized 0-1).
|
||||
* Injects default values (from rule `defaults`) for missing channels.
|
||||
* Merges channels using `cv2.merge`.
|
||||
* Determines output bit depth based on rule (`force_16bit`, `respect_inputs`).
|
||||
* Determines output format based on complex rules (`config.py` and preset), considering the highest format among *source* inputs if not forced lossless or over JPG threshold. Handles JPG 16-bit conflict by forcing 8-bit.
|
||||
* Saves the merged image using the `_save_image` helper, including final data type/color space conversions and fallback logic (e.g., EXR->PNG).
|
||||
* **Metadata (`_determine_base_metadata`, `_determine_single_asset_metadata`, `_generate_metadata_file`):**
|
||||
* Base name determined using `source_naming` separator/index from `Configuration`, with fallback to common prefix or input name. Handles multiple assets within one input.
|
||||
* Category determined by model presence or `decal_keywords` from `Configuration`.
|
||||
* Archetype determined by matching keywords in `archetype_rules` (from `Configuration`) against file stems/base name.
|
||||
* Final `metadata.json` populated by accumulating results (map details, stats, features, etc.) during the per-asset processing loop.
|
||||
|
||||
* **Blender Integration (`main.py:run_blender_script`, `gui/processing_handler.py:_run_blender_script_subprocess`):**
    * Uses `subprocess.run` to execute Blender.
    * The command includes `-b` (background), the target `.blend` file, `--python` followed by the script path (`blenderscripts/*.py`), and the `--` separator.
    * Arguments after `--` (currently just the `asset_root_dir`, and optionally the nodegroup blend path for the materials script) are passed to the Python script via `sys.argv`.
    * The GUI handler additionally uses `--factory-startup`. Both check the return code and log stdout/stderr.

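A minimal sketch of the command assembly described above. The function name and example paths are hypothetical; only the flag layout (`-b`, `--python`, `--`) follows the document.

```python
def build_blender_command(blender_exe, blend_file, script_path,
                          asset_root_dir, extra_args=()):
    """Assemble a headless Blender invocation (sketch).

    -b runs Blender in background mode; everything after the `--`
    separator is forwarded to the Python script via sys.argv.
    """
    return [str(blender_exe), "-b", str(blend_file),
            "--python", str(script_path),
            "--", str(asset_root_dir), *[str(a) for a in extra_args]]

# Typical use (sketch):
# cmd = build_blender_command("blender", "scene.blend",
#                             "blenderscripts/make_materials.py",
#                             "/assets/Brick")
# result = subprocess.run(cmd, capture_output=True, text=True)
# if result.returncode != 0:
#     log.error(result.stderr)
```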
## State Management

* **`Configuration` Object:** Holds the loaded and merged configuration state (core + preset) and compiled regex patterns. Designed to be immutable after initialization. Instantiated once per worker process.
* **`AssetProcessor` Instance:** Primarily stateless between calls to `process()`. State *within* a `process()` call is managed through local variables scoped to the overall call or the per-asset loop (e.g., `current_asset_metadata`, `processed_maps_details_asset`). `self.classified_files` is populated once by `_inventory_and_classify_files` early in `process()` and then used read-only (filtered copies) within the per-asset loop.
* **`main.py` (CLI):** Tracks overall run progress (processed, skipped, failed counts) based on results returned from worker processes.
* **`gui/processing_handler.py`:** Manages the state of a GUI processing run using internal flags (`_is_running`, `_cancel_requested`) and stores `Future` objects in the `self._futures` dictionary while the pool is active.

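The "immutable after initialization" pattern for configuration state can be sketched as below. `FrozenConfig` is not the tool's actual `Configuration` class; it only illustrates the core + preset merge with a read-only result.

```python
from types import MappingProxyType

class FrozenConfig:
    """Sketch: merge core settings with a preset, then freeze the result
    so worker code cannot mutate shared configuration state."""

    def __init__(self, core_settings, preset_settings):
        merged = {**core_settings, **preset_settings}  # preset wins on conflicts
        # MappingProxyType gives a read-only view of the merged dict.
        self.settings = MappingProxyType(merged)
```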
## Error Handling & Propagation

* **Custom Exceptions:** `ConfigurationError` (raised by `Configuration` on load/validation failure) and `AssetProcessingError` (raised by `AssetProcessor` for various processing failures).
* **Configuration:** A `ConfigurationError` halts initialization. Regex compilation errors are logged as warnings but do not stop initialization.
* **AssetProcessor:** Uses `try...except Exception` within key pipeline steps (`_process_maps`, `_merge_maps`, etc.) and within the per-asset loop in `process()`. Errors specific to one asset are logged (`log.error(exc_info=True)`), the asset is marked "failed" in the returned status dictionary, and the loop continues to the next asset. Critical setup errors (e.g., workspace creation) raise `AssetProcessingError`, halting the entire `process()` call. Includes specific save fallback logic (EXR->PNG) on `cv2.imwrite` failure for 16-bit formats.
* **Worker Wrapper (`main.py:process_single_asset_wrapper`):** Catches `ConfigurationError`, `AssetProcessingError`, and general `Exception` during worker execution. Logs the error and returns a `("failed", error_message)` status tuple to the main process.
* **Process Pool (`main.py`, `gui/processing_handler.py`):** The `with ProcessPoolExecutor(...)` block handles pool setup/teardown. A `try...except` around `as_completed` or `future.result()` catches critical worker failures (e.g., a process crash).
* **GUI Communication (`ProcessingHandler`):** Catches exceptions during `future.result()` retrieval. Emits the `file_status_updated` signal with "failed" status and the error message. Emits `processing_finished` with final counts.
* **Blender Scripts:** Checks the `subprocess.run` return code. Logs stderr as ERROR if the return code is non-zero, otherwise as WARNING. Catches `FileNotFoundError` if the Blender executable path is invalid.

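The per-asset isolation described for `AssetProcessor` can be sketched as a loop in which one asset's failure is logged and recorded without aborting the run. `run_per_asset` is an illustrative stand-in, not the tool's actual loop.

```python
import logging

log = logging.getLogger(__name__)

def run_per_asset(asset_names, process_one):
    """Sketch of per-asset error isolation: a failure is logged with the
    traceback, the asset is marked failed, and the loop continues."""
    results = {"processed": [], "skipped": [], "failed": []}
    for name in asset_names:
        try:
            process_one(name)
        except Exception:
            log.error("Asset %s failed", name, exc_info=True)
            results["failed"].append(name)
        else:
            results["processed"].append(name)
    return results
```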
## Key Data Structures

* **`Configuration` Instance Attributes:**
    * `compiled_map_keyword_regex`: `dict[str, list[tuple[re.Pattern, str, int]]]` (base type -> list of compiled regex tuples)
    * `compiled_extra_regex`, `compiled_model_regex`: `list[re.Pattern]`
    * `compiled_bit_depth_regex_map`: `dict[str, re.Pattern]` (base type -> compiled regex)
* **`AssetProcessor` Internal Structures (within `process()`):**
    * `self.classified_files`: `dict[str, list[dict]]` (category -> list of file info dicts like `{'source_path': Path, 'map_type': str, ...}`)
    * `processed_maps_details_asset`, `merged_maps_details_asset`: `dict[str, dict[str, dict]]` (map type -> resolution key -> details dict `{'path': Path, 'width': int, ...}`)
    * `file_to_base_name_map`: `dict[Path, Optional[str]]` (source relative path -> determined asset base name or None)
    * `current_asset_metadata`: `dict` (accumulates name, category, archetype, stats, and map details per asset)
* **Return Values:**
    * `AssetProcessor.process()`: `Dict[str, List[str]]` (e.g., `{"processed": [...], "skipped": [...], "failed": [...]}`)
    * `main.process_single_asset_wrapper()`: `Tuple[str, str, Optional[str]]` (input_path, status_string, error_message)
* **`ProcessingHandler._futures`:** `dict[Future, str]` (maps a `concurrent.futures.Future` object to the input path string)
* **Image Data:** `numpy.ndarray` (handled by OpenCV).

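For concreteness, the nested map-details structure might look like this (file names and resolution keys are hypothetical examples, not guaranteed output names):

```python
from pathlib import Path

# Illustrative contents of processed_maps_details_asset:
# map type -> resolution key -> details dict
processed_maps_details_asset = {
    "albedo": {
        "4K": {"path": Path("Brick_albedo_4K.png"), "width": 4096, "height": 4096},
        "2K": {"path": Path("Brick_albedo_2K.png"), "width": 2048, "height": 2048},
    },
    "normal": {
        "4K": {"path": Path("Brick_normal_4K.png"), "width": 4096, "height": 4096},
    },
}
```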
## Concurrency Models (CLI & GUI)

* **Common Core:** Both the CLI and GUI utilize `concurrent.futures.ProcessPoolExecutor` for parallel processing. The target function executed by workers is `main.process_single_asset_wrapper`.
* **Isolation:** Crucially, `Configuration` and `AssetProcessor` objects are instantiated *within* the `process_single_asset_wrapper` function, meaning each worker process gets its own independent configuration and processor instance based on the arguments passed. This prevents state conflicts between concurrent asset processing tasks. Data is passed between the main process and workers via pickling of arguments and return values.
* **CLI Orchestration (`main.py:run_processing`):**
    * Creates the `ProcessPoolExecutor`.
    * Submits all `process_single_asset_wrapper` tasks.
    * Uses `concurrent.futures.as_completed` to iterate over finished futures as they complete, blocking until the next one is done.
    * Gathers results synchronously within the main script's execution flow.
* **GUI Orchestration (`gui/processing_handler.py`):**
    * The `ProcessingHandler` object (a `QObject`) contains the `run_processing` method.
    * This method is intended to be run in a separate `QThread` (managed by `MainWindow`) to avoid blocking the main UI thread.
    * Inside `run_processing`, it creates and manages the `ProcessPoolExecutor`.
    * It uses `as_completed` similarly to the CLI to iterate over finished futures.
    * **Communication:** Instead of blocking while gathering results, it emits Qt signals (`progress_updated`, `file_status_updated`, `processing_finished`) from within the `as_completed` loop. These signals are connected to slots in `MainWindow` (running on the main UI thread), allowing thread-safe updates to the GUI (progress bar, table status, status bar messages).
* **Cancellation (GUI - `gui/processing_handler.py:request_cancel`):**
    * Sets an internal `_cancel_requested` flag.
    * Attempts `executor.shutdown(wait=False)`, which prevents new tasks from starting and may cancel pending ones (depending on the Python version).
    * Manually iterates through the stored `_futures` and calls `future.cancel()` on those not yet running or done.
    * **Limitation:** This does *not* forcefully terminate worker processes already executing `process_single_asset_wrapper`. Cancellation primarily affects pending tasks; already-running tasks are marked as failed/cancelled when their futures complete.

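The shared submit/`as_completed`/count pattern can be sketched as below. `run_processing` here is a simplified stand-in: the `executor_cls` parameter is an assumption added only so the sketch is easy to exercise with a `ThreadPoolExecutor`; the tool itself uses `ProcessPoolExecutor`.

```python
from concurrent.futures import ProcessPoolExecutor, as_completed

def run_processing(input_paths, worker, executor_cls=ProcessPoolExecutor):
    """Sketch of the common CLI/GUI orchestration pattern."""
    counts = {"processed": 0, "failed": 0}
    with executor_cls() as pool:
        # Map each Future back to its input path, as _futures does.
        futures = {pool.submit(worker, path): path for path in input_paths}
        for future in as_completed(futures):
            path = futures[future]
            try:
                status, _error = future.result()
            except Exception:
                status = "failed"
            counts["failed" if status == "failed" else "processed"] += 1
    return counts
```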
## Resource Management

* **Configuration:** Preset JSON files are opened and closed using `with open(...)`.
* **AssetProcessor:**
    * The temporary workspace directory is created using `tempfile.mkdtemp()`.
    * Cleanup (`_cleanup_workspace`) uses `shutil.rmtree()` and is called within a `finally` block in the main `process()` method, ensuring a cleanup attempt even if errors occur.
    * The metadata JSON file is written using `with open(...)`.
    * Image data is loaded into memory using OpenCV/NumPy; memory usage depends on image size and the number of concurrent workers.
* **Process Pool:** The `ProcessPoolExecutor` manages the lifecycle of worker processes. Using it within a `with` statement (as done in `main.py` and `gui/processing_handler.py`) ensures proper shutdown and resource release for the pool itself.

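The workspace lifecycle can be sketched as follows; `with_workspace` and the `asset_proc_` prefix are illustrative, but the `mkdtemp` + `finally`-guarded `rmtree` shape follows the description above.

```python
import shutil
import tempfile
from pathlib import Path

def with_workspace(work):
    """Sketch of the workspace lifecycle: mkdtemp up front, rmtree in a
    finally block so cleanup is attempted even when `work` raises."""
    workspace = Path(tempfile.mkdtemp(prefix="asset_proc_"))
    try:
        return work(workspace)
    finally:
        shutil.rmtree(workspace, ignore_errors=True)
```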
## Known Limitations & Edge Cases

* **Configuration:**
    * Validation (`_validate_configs`) is primarily structural (key presence, basic types), not deeply logical (e.g., it doesn't check whether regex patterns are *sensible*).
    * Regex compilation errors in `_compile_regex_patterns` are logged as warnings but don't prevent `Configuration` initialization, potentially leading to unexpected classification later.
    * The `_fnmatch_to_regex` helper only handles basic `*` and `?` wildcards. Complex fnmatch patterns might not translate correctly.
* **AssetProcessor:**
    * Heavily reliant on correct filename patterns and rules defined in presets. Ambiguous or incorrect patterns lead to misclassification.
    * Potential for high memory usage when processing very large images, especially with many workers.
    * Error handling within `process()` is per-asset; a failure during map processing for one asset marks the whole asset as failed without attempting its other maps. There is no partial recovery within an asset.
    * Gloss->Roughness inversion assumes the gloss map is single-channel or convertible to grayscale.
    * `predict_output_structure` and `get_detailed_file_predictions` use simplified logic (e.g., assuming PNG output and the highest resolution only) and may not perfectly match final output names/formats in all cases.
    * Filename sanitization (`_sanitize_filename`) is basic and might not cover all edge cases for all filesystems.
* **CLI (`main.py`):**
    * The preset existence check (`{preset}.json`) happens only in the main process, before workers start.
    * Blender executable discovery relies on the `config.py` path being valid or `blender` being on the system PATH.
* **GUI Concurrency (`gui/processing_handler.py`):**
    * Cancellation (`request_cancel`) is not immediate for tasks already running in worker processes. It prevents new tasks from starting and stops result processing once the flag is checked.
* **General:**
    * Limited input format support (ZIP archives, folders). Internal file formats are limited by OpenCV (`cv2.imread`, `cv2.imwrite`). The optional `OpenEXR` package is recommended for full EXR support.
    * Error messages propagated from workers might lack full context in some edge cases.
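The `_fnmatch_to_regex` limitation noted under Configuration can be illustrated with a minimal stand-in translator (not the tool's actual helper): only `*` and `?` are handled, and everything else, including character classes, is escaped literally.

```python
import re

def fnmatch_to_regex(pattern):
    """Minimal fnmatch-to-regex translation mirroring the documented
    limitation: '*' -> '.*', '?' -> '.', everything else is literal."""
    parts = []
    for ch in pattern:
        if ch == "*":
            parts.append(".*")
        elif ch == "?":
            parts.append(".")
        else:
            parts.append(re.escape(ch))
    return re.compile("".join(parts) + r"\Z")
```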