Initial Work on data-transfer refactor

This commit is contained in:
Rusfort 2025-04-29 23:11:44 +02:00
parent f66971f3fd
commit a6dc49ebb1
16 changed files with 1092 additions and 246 deletions

View File

@ -12,7 +12,7 @@ This documentation strictly excludes details on environment setup, dependency in
## Architecture and Codebase Summary
For developers interested in contributing, the tool's architecture is designed around a **Core Processing Engine** (`asset_processor.py`) that handles the pipeline for single assets, supported by a **Configuration System** (`configuration.py` and `config.py` with `Presets/*.json`). Multiple interfaces are provided: a **Graphical User Interface** (`gui/`), a **Command-Line Interface** (`main.py`), and a **Directory Monitor** (`monitor.py`). Optional **Blender Integration** (`blenderscripts/`) is also included.
For developers interested in contributing, the tool's architecture is designed around a **Core Processing Engine** (`asset_processor.py`) that handles the pipeline for single assets. This engine is supported by a **Configuration System** (`configuration.py` and `config.py` with `Presets/*.json`) and a new **Hierarchical Rule System** (`rule_structure.py`) that allows dynamic overrides of static configurations at Source, Asset, and File levels. Multiple interfaces are provided: a **Graphical User Interface** (`gui/`), a **Command-Line Interface** (`main.py`), and a **Directory Monitor** (`monitor.py`). Optional **Blender Integration** (`blenderscripts/`) is also included. Key new files supporting the hierarchical rule system include `rule_structure.py`, `gui/rule_hierarchy_model.py`, and `gui/rule_editor_widget.py`.
The codebase is organized into key directories and files reflecting these components. The `gui/` directory contains all GUI-related code, `Presets/` holds configuration presets, and `blenderscripts/` contains scripts for Blender interaction. The core logic resides in files like `asset_processor.py`, `configuration.py`, `config.py`, `main.py`, and `monitor.py`. The processing pipeline involves steps such as file classification, map processing, channel merging, and metadata generation.

View File

@ -15,6 +15,16 @@ The Asset Processor Tool is designed to process 3D asset source files into a sta
These interfaces exchange data structures containing information about each file and asset set being processed.
4. **Optional Integration:** Includes scripts and logic for integrating with external software, specifically Blender, to automate material and node group creation.
## Hierarchical Rule System
A key addition to the architecture is the **Hierarchical Rule System**, which provides a dynamic layer of configuration that can override the static settings loaded from presets. This system is represented by a hierarchy of rule objects: `SourceRule`, `AssetRule`, and `FileRule`.
* **SourceRule:** Represents rules applied at the top level, typically corresponding to an entire input source (e.g., a ZIP file or folder).
* **AssetRule:** Represents rules applied to a specific asset within a source (a source can contain multiple assets).
* **FileRule:** Represents rules applied to individual files within an asset.
This hierarchy allows for fine-grained control over processing parameters. When the processing engine needs a configuration value (e.g., a naming convention or image processing setting), it first checks the `FileRule` for the specific file, then the `AssetRule` for the asset containing the file, then the `SourceRule` for the overall source. If a value is not found at any of these dynamic levels, it falls back to the static configuration loaded from the selected preset. This prioritization logic (File > Asset > Source > Static Preset) ensures that the most specific rule available is always used.
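The lookup order described above can be sketched as a short helper. This is a minimal illustration with plain dictionaries standing in for the rule objects; the function name `resolve` is hypothetical, not the tool's actual API:

```python
def resolve(key, file_rule, asset_rule, source_rule, static_config, default=None):
    """Return the most specific value for `key`: File > Asset > Source > Static."""
    for level in (file_rule, asset_rule, source_rule):
        if level and key in level:
            return level[key]
    return static_config.get(key, default)

# Example: the file rule overrides the asset rule, which overrides the source/preset.
static = {"jpg_quality": 90, "asset_type": "Texture"}
source = {"asset_type": "Decal"}
asset = {}
file_ = {"jpg_quality": 75}

print(resolve("jpg_quality", file_, asset, source, static))  # 75 (File level)
print(resolve("asset_type", file_, asset, source, static))   # Decal (Source level)
```

The real `_get_rule_with_fallback` follows the same priority chain but reads from the `SourceRule` hierarchy and falls back to the loaded `Configuration` object.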
## Core Components
* `config.py`: Defines core, global settings and constants.

View File

@ -14,7 +14,32 @@ The `AssetProcessor` class is the central engine of the tool. It is responsible
* Merging channels from different maps according to merge rules.
* Generating the `metadata.json` file containing details about the processed asset.
* Organizing the final output files into the structured library directory.
* Providing methods (`get_detailed_file_predictions`) used by the GUI for previewing file classification.
* Accepting a `SourceRule` object which represents the hierarchical rules (Source, Asset, File) that can override static configuration values.
* Implementing the logic (`_get_rule_with_fallback`) to retrieve configuration values by prioritizing rules in the order: File -> Asset -> Source -> Static Config.
* Applying this hierarchical rule logic during file classification, prediction, image processing, and metadata generation.
* Providing methods (`get_detailed_file_predictions`) used by the GUI for previewing file classification, now incorporating the hierarchical rules.
## `Rule Structure` (`rule_structure.py`)
This module defines the data structures used to represent the hierarchical processing rules:
* `SourceRule`: A dataclass representing rules applied at the source level. It contains nested `AssetRule` objects.
* `AssetRule`: A dataclass representing rules applied at the asset level. It contains nested `FileRule` objects.
* `FileRule`: A dataclass representing rules applied at the file level.
These classes hold specific rule parameters (e.g., `supplier_identifier`, `asset_type`, `map_type_override`) and support serialization (Pickle, JSON) to allow them to be passed between different parts of the application, including across process boundaries.
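A minimal sketch of how such a nested dataclass hierarchy serializes to JSON. The field names here are an illustrative subset, not the full rule schema:

```python
import json
from dataclasses import dataclass, field, asdict
from typing import List, Optional

@dataclass
class FileRule:
    file_path: str
    map_type_override: Optional[str] = None

@dataclass
class AssetRule:
    asset_name: str
    asset_type: Optional[str] = None
    file_rules: List[FileRule] = field(default_factory=list)

@dataclass
class SourceRule:
    supplier_identifier: Optional[str] = None
    asset_rules: List[AssetRule] = field(default_factory=list)

rule = SourceRule(
    supplier_identifier="ExampleSupplier",
    asset_rules=[AssetRule("Wood01", file_rules=[FileRule("wood_gloss.png", "Roughness")])],
)
payload = json.dumps(asdict(rule))  # nested dataclasses flatten cleanly to JSON
restored = json.loads(payload)
```

Because the objects are plain dataclasses, they also pickle without extra machinery, which is what allows them to cross the `ProcessPoolExecutor` process boundary.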
## `Rule Hierarchy Model` (`gui/rule_hierarchy_model.py`)
The `RuleHierarchyModel` subclasses `QAbstractItemModel` for use with Qt's model-view architecture. It is specifically designed to:
* Wrap a `SourceRule` object and expose its hierarchical structure (Source -> Asset -> File) to a `QTreeView`.
* Provide methods (`data`, `index`, `parent`, `rowCount`, `columnCount`) required by `QAbstractItemModel` to allow the `QTreeView` to display the rule hierarchy.
* Enable navigation and selection within the rule hierarchy in the GUI.
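The parent/row bookkeeping such a model needs can be illustrated without Qt. In the real class, `QModelIndex.internalPointer()` would carry the node, and `rowCount()`/`parent()` would wrap these lookups; `_Node` is a hypothetical stand-in, not part of the codebase:

```python
class _Node:
    """Plain tree node. In the real model, each node wraps one rule object
    and QModelIndex.internalPointer() would point at it."""
    def __init__(self, rule, parent=None):
        self.rule, self.parent, self.children = rule, parent, []
        if parent is not None:
            parent.children.append(self)

    def row(self):
        # The row the model would report for this node's QModelIndex.
        return 0 if self.parent is None else self.parent.children.index(self)

# Build the Source -> Asset -> File shape the QTreeView displays.
source = _Node("SourceRule")
wood = _Node("AssetRule:Wood01", source)
metal = _Node("AssetRule:Metal02", source)
albedo = _Node("FileRule:wood_albedo.png", wood)
```

With this bookkeeping in place, `rowCount(parent)` is `len(node.children)` and `parent(index)` walks back up via `node.parent`.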
## `Rule Editor Widget` (`gui/rule_editor_widget.py`)
The `RuleEditorWidget` is a custom Qt widget that provides a user interface for viewing and editing the attributes of a selected rule object (`SourceRule`, `AssetRule`, or `FileRule`). It dynamically generates input fields (e.g., line edits, checkboxes) based on the attributes of the rule object it is currently displaying. It allows users to modify the dynamic rule parameters through the GUI.
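The dynamic field generation can be sketched by inspecting the dataclass fields of the selected rule object. This is a simplified stand-in: the real widget creates `QLineEdit`/`QCheckBox` instances, and both `widget_plan` and the `is_16_bit` field are hypothetical:

```python
import dataclasses
from typing import Optional

@dataclasses.dataclass
class FileRule:  # illustrative subset of the real dataclass
    map_type_override: Optional[str] = None
    is_16_bit: bool = False

def widget_plan(rule_obj):
    """Map each rule attribute to the kind of input widget the editor would
    create for it (the real widget instantiates Qt widgets instead)."""
    plan = {}
    for f in dataclasses.fields(rule_obj):
        value = getattr(rule_obj, f.name)
        plan[f.name] = "checkbox" if isinstance(value, bool) else "line_edit"
    return plan

print(widget_plan(FileRule()))
# {'map_type_override': 'line_edit', 'is_16_bit': 'checkbox'}
```

Iterating `dataclasses.fields()` rather than hard-coding attribute names is what lets one editor widget serve `SourceRule`, `AssetRule`, and `FileRule` alike.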
## `Configuration` (`configuration.py`)

View File

@ -2,7 +2,9 @@
This document details the step-by-step technical process executed by the `AssetProcessor` class (`asset_processor.py`) when processing a single asset.
The `AssetProcessor.process()` method orchestrates the following pipeline:
The `AssetProcessor.process()` method orchestrates the following pipeline. A key aspect of this pipeline is the integration of the **Hierarchical Rule System**, which allows dynamic rules (Source, Asset, File) to override static configuration values loaded from presets. The `_get_rule_with_fallback` helper method is used throughout the pipeline to retrieve configuration values, prioritizing rules in the order: File -> Asset -> Source -> Static Config.
The pipeline steps are:
1. **Workspace Setup (`_setup_workspace`)**:
* Creates a temporary directory using `tempfile.mkdtemp()` to isolate the processing of the current asset.
@ -14,7 +16,7 @@ The `AssetProcessor.process()` method orchestrates the following pipeline:
3. **File Inventory and Classification (`_inventory_and_classify_files`)**:
* Scans the contents of the temporary workspace.
* Uses the pre-compiled regex patterns from the loaded `Configuration` object to classify each file.
* Uses the pre-compiled regex patterns from the loaded `Configuration` object and the rules from the `SourceRule` object to classify each file. The hierarchical rules can influence classification patterns and other parameters used in this step.
* Classification follows a multi-pass approach for priority:
* Explicitly marked `Extra/` files (using `move_to_extra_patterns` regex).
* Model files (using `model_patterns` regex).
@ -27,6 +29,7 @@ The `AssetProcessor.process()` method orchestrates the following pipeline:
* Sorts potential map variants based on preset rule order, keyword order within the rule, and finally alphabetical path to determine suffix assignment priority (`-1`, `-2`, etc.).
4. **Base Metadata Determination (`_determine_base_metadata`, `_determine_single_asset_metadata`)**:
* This step now utilizes the hierarchical rules from the `SourceRule` object. For example, the `supplier_identifier`, `asset_type`, and `asset_name_override` attributes from the rules can override the values determined from the static configuration or input file names. The `_get_rule_with_fallback` method is used here to apply the rule prioritization logic.
* Determines the base asset name using `source_naming_convention` rules from the `Configuration` (separators, indices), with fallbacks to common prefixes or the input name. Handles multiple distinct assets within a single input source.
* Determines the asset category (`Texture`, `Asset`, `Decal`) based on the presence of model files or `decal_keywords` in the `Configuration`.
* Determines the asset archetype (e.g., `Wood`, `Metal`) by matching keywords from `archetype_rules` (in `Configuration`) against file stems or the determined base name.
@ -37,7 +40,7 @@ The `AssetProcessor.process()` method orchestrates the following pipeline:
* If both exist, processing for this specific asset is skipped, marked as "skipped", and the pipeline moves to the next asset (if processing multiple assets from one source) or finishes.
6. **Map Processing (`_process_maps`)**:
* Iterates through the files classified as texture maps for the current asset.
* Iterates through the files classified as texture maps for the current asset. Configuration values used in this step, such as target resolutions, bit depth rules, and output format rules, are now retrieved using the `_get_rule_with_fallback` method, allowing them to be overridden by the hierarchical rules.
* Loads the image using `cv2.imread` (handling grayscale and unchanged flags). Converts BGR to RGB internally for consistency (except for saving non-EXR formats).
* Handles Glossiness-to-Roughness inversion if necessary (loads gloss, inverts `1.0 - img/norm`, prioritizes gloss source if both exist).
* Resizes the image to target resolutions defined in `IMAGE_RESOLUTIONS` (from `Configuration`) using `cv2.resize` (`INTER_LANCZOS4` for downscaling). Upscaling is generally avoided by checks.
@ -53,6 +56,7 @@ The `AssetProcessor.process()` method orchestrates the following pipeline:
* Stores details about each processed map (path, resolution, format, stats, etc.) in `processed_maps_details_asset`.
7. **Map Merging (`_merge_maps_from_source`)**:
* The `MAP_MERGE_RULES` themselves can be influenced by the hierarchical rules. Additionally, parameters within the merge rules, such as default channel values or output format/bit depth, are retrieved using `_get_rule_with_fallback`, allowing dynamic overrides.
* Iterates through the `MAP_MERGE_RULES` defined in the `Configuration`.
* Identifies the required *source* map files needed as input for each merge rule based on the classified files.
* Determines common resolutions available across the required input maps.
@ -65,6 +69,7 @@ The `AssetProcessor.process()` method orchestrates the following pipeline:
* Stores details about each merged map in `merged_maps_details_asset`.
8. **Metadata File Generation (`_generate_metadata_file`)**:
* The metadata generated now includes information derived from the applied hierarchical rules, in addition to the static configuration and processing results.
* Collects all determined information for the current asset: base metadata, details from `processed_maps_details_asset` and `merged_maps_details_asset`, list of ignored files, source preset used, etc.
* Writes this collected data into the `metadata.json` file within the temporary workspace using `json.dump`.
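A minimal sketch of this final step, assuming an illustrative subset of the metadata keys (the real file also records per-map details, merge results, and the applied rules):

```python
import json
import tempfile
from pathlib import Path

# Illustrative subset of what _generate_metadata_file collects per asset.
asset_metadata = {
    "asset_name": "Wood01",
    "asset_category": "Texture",
    "archetype": "Wood",
    "source_preset": "DefaultPreset",
    "ignored_files": [],
}

workspace = Path(tempfile.mkdtemp())       # stands in for the temporary workspace
metadata_path = workspace / "metadata.json"
with open(metadata_path, "w", encoding="utf-8") as fh:
    json.dump(asset_metadata, fh, indent=2)

reloaded = json.loads(metadata_path.read_text(encoding="utf-8"))
```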

View File

@ -16,12 +16,14 @@ The `MainWindow` class is the central component of the GUI application. It is re
* Connecting user interactions (button clicks, drag-and-drop events, checkbox states, spinbox values) to corresponding methods (slots) within the `MainWindow` or other handler classes.
* Managing the display of application logs in the UI console using a custom `QtLogHandler`.
* Interacting with background handlers (`ProcessingHandler`, `PredictionHandler`) via Qt signals and slots to ensure thread-safe updates to the UI during long-running operations.
* Integrating the `QTreeView` (displaying the rule hierarchy via `RuleHierarchyModel`) and the `RuleEditorWidget` for interactive rule editing.
* Connecting the selection changes in the `QTreeView` to update the `RuleEditorWidget` with the selected rule object's attributes.
## Threading and Background Tasks
To keep the UI responsive during intensive operations like asset processing and file preview generation, the GUI utilizes background threads managed by `QThread`.
* **`ProcessingHandler` (`gui/processing_handler.py`):** This class is designed to run in a separate `QThread`. It manages the execution of the main asset processing pipeline for multiple assets concurrently using `concurrent.futures.ProcessPoolExecutor`. It submits individual asset processing tasks to the pool and monitors their completion. It uses Qt signals to communicate progress updates, file status changes, and overall processing completion back to the `MainWindow` on the main UI thread. It also handles the execution of optional Blender scripts via subprocess calls after processing. This handler processes and utilizes data structures received from the core processing engine, such as status summaries.
* **`ProcessingHandler` (`gui/processing_handler.py`):** This class is designed to run in a separate `QThread`. It manages the execution of the main asset processing pipeline for multiple assets concurrently using `concurrent.futures.ProcessPoolExecutor`. It submits individual asset processing tasks to the pool and monitors their completion. It uses Qt signals to communicate progress updates, file status changes, and overall processing completion back to the `MainWindow` on the main UI thread. It also handles the execution of optional Blender scripts via subprocess calls after processing. This handler processes and utilizes data structures received from the core processing engine, such as status summaries. It receives the initial `SourceRule` object generated by the `PredictionHandler` and passes it to the `AssetProcessor` for processing.
* **`PredictionHandler` (`gui/prediction_handler.py`):** This class also runs in a separate `QThread`. It is responsible for generating the detailed file classification previews displayed in the preview table. It calls methods on the `AssetProcessor` (`get_detailed_file_predictions`) to perform the analysis in the background. It uses a `ThreadPoolExecutor` for potentially concurrent prediction tasks. Results are sent back to the `MainWindow` via Qt signals to update the preview table data. This handler works with data structures containing file prediction details.
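The background prediction pattern described above can be sketched without Qt. Here a plain callback stands in for the Qt signal emission, and `predict` is a hypothetical stand-in for the real `get_detailed_file_predictions` call:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def predict(source_name):
    """Stand-in for calling AssetProcessor.get_detailed_file_predictions()."""
    return source_name, f"{source_name}: 3 maps classified"

results = {}

def on_result(name, summary):
    # In the GUI this would be a Qt signal delivering the result to the
    # main thread; here we just collect it.
    results[name] = summary

sources = ["rock_pack.zip", "wood_floor.zip"]
with ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(predict, s) for s in sources]
    for fut in as_completed(futures):
        name, summary = fut.result()
        on_result(name, summary)
```

The same submit-and-collect shape applies to `ProcessingHandler`, except that it uses a `ProcessPoolExecutor` so each asset is processed in a separate process.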
## Communication (Signals and Slots)
@ -54,6 +56,17 @@ The `data()` method of the `PreviewTableModel` then accesses this structured `se
The `PreviewSortFilterProxyModel` operates on this structured data, implementing a multi-level sort based on source asset, row type (main vs. additional-only), and file paths within those types.
* Calls methods on the `AssetProcessor` (specifically `get_detailed_file_predictions`) to perform the analysis in the background. It uses a `ThreadPoolExecutor` for potentially concurrent prediction tasks. Results are sent back to the `MainWindow` via Qt signals to update the preview table data. This handler works with data structures containing file prediction details. It is also responsible for generating the initial `SourceRule` hierarchy based on the input files and the selected preset, and emitting a signal to provide this `SourceRule` object to the `MainWindow`.
## Rule Hierarchy UI (`gui/rule_hierarchy_model.py`, `gui/rule_editor_widget.py`)
The GUI now includes dedicated components for visualizing and editing the hierarchical processing rules:
* **`Rule Hierarchy Model` (`gui/rule_hierarchy_model.py`):** This class implements `QAbstractItemModel` to expose the structure of a `SourceRule` object (Source -> Asset -> File) to a `QTreeView`. It allows the `QTreeView` to display the hierarchy and enables user selection of individual rule objects (SourceRule, AssetRule, or FileRule).
* **`Rule Editor Widget` (`gui/rule_editor_widget.py`):** This custom widget provides a dynamic form for editing the attributes of the currently selected rule object from the hierarchy tree. When a rule object is selected in the `QTreeView`, the `MainWindow` updates the `RuleEditorWidget` to display and allow modification of that object's specific parameters.
These components are integrated into the `MainWindow`. The `PredictionHandler` generates the initial `SourceRule` hierarchy, which is then set on the `RuleHierarchyModel`. The `QTreeView` displays this model, and selection changes in the tree trigger updates to the `RuleEditorWidget`, allowing users to interactively modify the dynamic rules before processing. Edits made in the `RuleEditorWidget` directly modify the attributes of the underlying rule objects in the `SourceRule` hierarchy.
### Preview Table Column Configuration
The display and behavior of the columns in the `QTableView` are configured in `gui/main_window.py`. The current configuration is as follows:

View File

@ -13,6 +13,8 @@ from pathlib import Path
from fnmatch import fnmatch # For pattern matching like *.fbx, *_Preview*
from typing import List, Dict, Tuple, Optional # Added for type hinting
from collections import defaultdict # Added for grouping
from rule_structure import SourceRule # Import SourceRule
# Attempt to import archive libraries
try:
@ -223,6 +225,27 @@ class AssetProcessor:
log.debug(f"AssetProcessor initialized for: {self.input_path.name}")
# --- Helper Method: Get Rule with Fallback ---
def _get_rule_with_fallback(self, rules: SourceRule, rule_key: str, file_path: Path | None = None, asset_name: str | None = None, default=None):
"""
Retrieves a rule value using hierarchical fallback logic:
File-specific > Asset-specific > Source/General Rules > Config Default.
"""
# Prioritize File > Asset > Source > Config
if file_path and rule_key in rules.file_rules.get(str(file_path), {}):
log.debug(f"Rule '{rule_key}' found at File level for '{file_path}'.")
return rules.file_rules[str(file_path)][rule_key]
if asset_name and rule_key in rules.asset_rules.get(asset_name, {}):
log.debug(f"Rule '{rule_key}' found at Asset level for '{asset_name}'.")
return rules.asset_rules[asset_name][rule_key]
if rule_key in rules.high_level_sorting_parameters:
log.debug(f"Rule '{rule_key}' found at Source level.")
return rules.high_level_sorting_parameters[rule_key]
# Fallback to config
log.debug(f"Rule '{rule_key}' not found in rules, falling back to config default.")
return getattr(self.config, rule_key, default)
# --- New Helper Function: Load and Transform Source ---
def _load_and_transform_source(self, source_path_rel: Path, map_type: str, target_resolution_key: str, is_gloss_source: bool, cache: dict) -> Tuple[Optional[np.ndarray], Optional[np.dtype]]:
"""
@ -379,7 +402,11 @@ class AssetProcessor:
# --- New Helper Function: Save Image ---
def _save_image(self, image_data: np.ndarray, map_type: str, resolution_key: str, asset_base_name: str, source_info: dict, output_bit_depth_rule: str, temp_dir: Path) -> Optional[Dict]:
def _save_image(self, image_data: np.ndarray, map_type: str, resolution_key: str, asset_base_name: str, source_info: dict, output_bit_depth_rule: str, temp_dir: Path,
# Parameters passed down from calling methods using _get_rule_with_fallback
output_formats_16bit: Tuple[str, str], output_format_8bit: str, resolution_threshold_for_jpg: int,
force_lossless_map_types: List[str], jpg_quality: int, png_compression_level: int,
target_filename_pattern: str, image_resolutions: Dict[str, int]) -> Optional[Dict]:
"""
Handles saving an image NumPy array to a temporary file, including determining
the output format and bit depth, performing final conversions, and applying fallback logic.
@ -393,6 +420,14 @@ class AssetProcessor:
{'original_extension': '.tif', 'source_bit_depth': 16, 'involved_extensions': {'.tif', '.png'}}
output_bit_depth_rule: Rule for determining output bit depth ('respect', 'force_8bit', 'force_16bit', 'respect_inputs').
temp_dir: The temporary directory path to save the file in.
output_formats_16bit: Tuple of primary and fallback 16-bit formats (e.g., ('exr', 'png')).
output_format_8bit: Default 8-bit format (e.g., 'jpg').
resolution_threshold_for_jpg: Threshold above which 8-bit images become JPG.
force_lossless_map_types: List of map types to always save losslessly.
jpg_quality: Quality setting for JPG saves.
png_compression_level: Compression level for PNG saves.
target_filename_pattern: Format string for output filenames.
image_resolutions: Dictionary mapping resolution keys to pixel dimensions.
Returns:
A dictionary containing details of the saved file (path, width, height,
@ -433,13 +468,14 @@ class AssetProcessor:
# --- 2. Determine Output Format ---
output_format, output_ext, save_params, needs_float16 = "", "", [], False
primary_fmt_16, fallback_fmt_16 = self.config.get_16bit_output_formats()
fmt_8bit_config = self.config.get_8bit_output_format()
threshold = self.config.resolution_threshold_for_jpg
force_lossless = map_type in self.config.force_lossless_map_types
# Use passed-in parameters instead of self.config
primary_fmt_16, fallback_fmt_16 = output_formats_16bit
fmt_8bit_config = output_format_8bit
threshold = resolution_threshold_for_jpg
force_lossless = map_type in force_lossless_map_types
original_extension = source_info.get('original_extension', '.png') # Primary source ext
involved_extensions = source_info.get('involved_extensions', {original_extension}) # For merges
target_dim_px = self.config.image_resolutions.get(resolution_key, 0) # Get target dimension size
target_dim_px = image_resolutions.get(resolution_key, 0) # Get target dimension size
# Apply format determination logic (similar to old _process_maps/_merge_maps)
if force_lossless:
@ -453,45 +489,46 @@ class AssetProcessor:
if output_format != "png": log.warning(f"Primary 16-bit format '{output_format}' not PNG/EXR for forced lossless. Using fallback '{fallback_fmt_16}'.")
output_format = fallback_fmt_16 if fallback_fmt_16 == "png" else "png" # Ensure PNG
output_ext = ".png"
png_level = self.config._core_settings.get('PNG_COMPRESSION_LEVEL', 6)
save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, png_level])
# Use passed-in parameter
save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, png_compression_level])
else: # 8-bit lossless -> PNG
output_format = "png"; output_ext = ".png"
png_level = self.config._core_settings.get('PNG_COMPRESSION_LEVEL', 6)
save_params = [cv2.IMWRITE_PNG_COMPRESSION, png_level]
# Use passed-in parameter
save_params = [cv2.IMWRITE_PNG_COMPRESSION, png_compression_level]
elif output_bit_depth == 8 and target_dim_px >= threshold:
output_format = 'jpg'; output_ext = '.jpg'
jpg_quality = self.config.jpg_quality
# Use passed-in parameter
save_params.extend([cv2.IMWRITE_JPEG_QUALITY, jpg_quality])
log.debug(f"Using JPG format (Quality: {jpg_quality}) for {map_type} at {resolution_key} due to resolution threshold ({target_dim_px} >= {threshold}).")
else:
# Determine highest format involved (for merges) or use original (for individuals)
highest_format_str = 'jpg' # Default lowest
relevant_extensions = involved_extensions if map_type in self.config.map_merge_rules else {original_extension}
# Check against passed-in map_merge_rules if needed, but currently not used here
relevant_extensions = involved_extensions # Use involved_extensions directly
if '.exr' in relevant_extensions: highest_format_str = 'exr'
elif '.tif' in relevant_extensions: highest_format_str = 'tif'
elif '.png' in relevant_extensions: highest_format_str = 'png'
if highest_format_str == 'exr':
if output_bit_depth == 16: output_format, output_ext, needs_float16 = "exr", ".exr", True; save_params.extend([cv2.IMWRITE_EXR_TYPE, cv2.IMWRITE_EXR_TYPE_HALF])
else: output_format, output_ext = "png", ".png"; save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, self.config._core_settings.get('PNG_COMPRESSION_LEVEL', 6)])
else: output_format, output_ext = "png", ".png"; save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, png_compression_level]) # Use param
elif highest_format_str == 'tif':
if output_bit_depth == 16:
output_format = primary_fmt_16
if output_format.startswith("exr"): output_ext, needs_float16 = ".exr", True; save_params.extend([cv2.IMWRITE_EXR_TYPE, cv2.IMWRITE_EXR_TYPE_HALF])
else: output_format = "png"; output_ext = ".png"; save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, self.config._core_settings.get('PNG_COMPRESSION_LEVEL', 6)])
else: output_format, output_ext = "png", ".png"; save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, self.config._core_settings.get('PNG_COMPRESSION_LEVEL', 6)])
else: output_format = "png"; output_ext = ".png"; save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, png_compression_level]) # Use param
else: output_format, output_ext = "png", ".png"; save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, png_compression_level]) # Use param
elif highest_format_str == 'png':
if output_bit_depth == 16:
output_format = primary_fmt_16
if output_format.startswith("exr"): output_ext, needs_float16 = ".exr", True; save_params.extend([cv2.IMWRITE_EXR_TYPE, cv2.IMWRITE_EXR_TYPE_HALF])
else: output_format = "png"; output_ext = ".png"; save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, self.config._core_settings.get('PNG_COMPRESSION_LEVEL', 6)])
else: output_format, output_ext = "png", ".png"; save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, self.config._core_settings.get('PNG_COMPRESSION_LEVEL', 6)])
else: output_format = "png"; output_ext = ".png"; save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, png_compression_level]) # Use param
else: output_format, output_ext = "png", ".png"; save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, png_compression_level]) # Use param
else: # Default to configured 8-bit format if highest was JPG or unknown
output_format = fmt_8bit_config; output_ext = f".{output_format}"
if output_format == "png": save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, self.config._core_settings.get('PNG_COMPRESSION_LEVEL', 6)])
elif output_format == "jpg": save_params.extend([cv2.IMWRITE_JPEG_QUALITY, self.config.jpg_quality])
if output_format == "png": save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, png_compression_level]) # Use param
elif output_format == "jpg": save_params.extend([cv2.IMWRITE_JPEG_QUALITY, jpg_quality]) # Use param
# Final check: JPG must be 8-bit
if output_format == "jpg" and output_bit_depth == 16:
@ -531,7 +568,8 @@ class AssetProcessor:
img_save_final = img_to_save # Fallback
# --- 5. Construct Filename & Save ---
filename = self.config.target_filename_pattern.format(
# Use passed-in parameter
filename = target_filename_pattern.format(
base_name=asset_base_name,
map_type=map_type,
resolution=resolution_key,
@ -552,9 +590,11 @@ class AssetProcessor:
if output_bit_depth == 16 and output_format.startswith("exr") and fallback_fmt_16 != output_format and fallback_fmt_16 == "png":
log.warning(f"Attempting fallback PNG save for {map_type} {resolution_key}")
actual_format_saved = "png"; output_ext = ".png";
filename = self.config.target_filename_pattern.format(base_name=asset_base_name, map_type=map_type, resolution=resolution_key, ext="png")
# Use passed-in parameter
filename = target_filename_pattern.format(base_name=asset_base_name, map_type=map_type, resolution=resolution_key, ext="png")
output_path_temp = temp_dir / filename
save_params_fallback = [cv2.IMWRITE_PNG_COMPRESSION, self.config._core_settings.get('PNG_COMPRESSION_LEVEL', 6)]
# Use passed-in parameter
save_params_fallback = [cv2.IMWRITE_PNG_COMPRESSION, png_compression_level]
img_fallback = None; target_fallback_dtype = np.uint16
# Convert original data (before float16 conversion) to uint16 for PNG fallback
@ -600,7 +640,7 @@ class AssetProcessor:
log.error(f"Unexpected error in _save_image for {map_type} ({resolution_key}): {e}", exc_info=True)
return None
def process(self) -> Dict[str, List[str]]:
def process(self, rules: SourceRule) -> Dict[str, List[str]]:
"""
Executes the full processing pipeline for the input path, handling
multiple assets within a single input if detected.
@ -620,10 +660,11 @@ class AssetProcessor:
try:
self._setup_workspace()
self._extract_input()
self._inventory_and_classify_files() # Classifies all files in self.classified_files
# Pass rules to classification
self._inventory_and_classify_files(rules) # Classifies all files in self.classified_files
# Determine distinct assets and file mapping
distinct_base_names, file_to_base_name_map = self._determine_base_metadata()
# Determine distinct assets and file mapping, pass rules
distinct_base_names, file_to_base_name_map = self._determine_base_metadata(rules)
unmatched_files_paths = [p for p, name in file_to_base_name_map.items() if name is None]
if unmatched_files_paths:
log.warning(f"Found {len(unmatched_files_paths)} files not matched to any specific asset base name. They will be copied to each asset's Extra folder.")
@ -669,11 +710,12 @@ class AssetProcessor:
log.debug(f" Updating '{map_info['source_path']}' map_type from '{map_info['map_type']}' to '{final_map_type}'")
map_info['map_type'] = final_map_type # Update the map_type in the dictionary
# --- Determine Metadata for this specific asset ---
asset_specific_metadata = self._determine_single_asset_metadata(current_asset_name, filtered_classified_files_asset)
# --- Determine Metadata for this specific asset, pass rules ---
asset_specific_metadata = self._determine_single_asset_metadata(current_asset_name, filtered_classified_files_asset, rules)
current_asset_metadata = {
"asset_name": current_asset_name,
"supplier_name": supplier_name,
# Supplier name is determined by rules/config in _determine_single_asset_metadata
"supplier_name": asset_specific_metadata.get("supplier_name", supplier_name), # Use determined supplier name
"asset_category": asset_specific_metadata.get("asset_category", self.config.default_asset_category),
"archetype": asset_specific_metadata.get("archetype", "Unknown"),
# Initialize fields that will be populated by processing steps
@ -700,11 +742,12 @@ class AssetProcessor:
elif self.overwrite:
log.info(f"Overwrite flag is set. Processing asset '{current_asset_name}' even if output exists.")
# --- Process Individual Maps for this asset ---
# --- Process Individual Maps for this asset, pass rules ---
processed_maps_details_asset, image_stats_asset, aspect_ratio_change_string_asset, ignored_rough_maps = self._process_individual_maps(
filtered_maps_list=filtered_classified_files_asset.get('maps', []),
current_asset_metadata=current_asset_metadata, # Pass base metadata
loaded_data_cache=loaded_data_cache # Pass cache
loaded_data_cache=loaded_data_cache, # Pass cache
rules=rules # Pass rules
)
# Update current metadata with results
current_asset_metadata["image_stats_1k"] = image_stats_asset
@ -717,22 +760,31 @@ class AssetProcessor:
map_details_asset = {k: v for k, v in current_asset_metadata.pop("map_details", {}).items() if k in processed_maps_details_asset}
# --- Merge Maps from Source for this asset ---
# --- Merge Maps from Source for this asset, pass rules ---
merged_maps_details_asset = self._merge_maps_from_source(
processed_maps_details_asset=processed_maps_details_asset, # Still needed for source info lookup? Or pass classified files? Check impl.
filtered_classified_files=filtered_classified_files_asset,
current_asset_metadata=current_asset_metadata,
loaded_data_cache=loaded_data_cache # Pass cache
loaded_data_cache=loaded_data_cache, # Pass cache
rules=rules # Pass rules
)
# --- Generate Metadata for this asset ---
# --- Generate Metadata for this asset, pass rules ---
# Get config values using fallback for _generate_metadata_file
preset_name_rule = self._get_rule_with_fallback(rules, 'preset_name', asset_name=current_asset_name, default=self.config.preset_name)
metadata_filename_rule = self._get_rule_with_fallback(rules, 'metadata_filename', asset_name=current_asset_name, default=self.config.metadata_filename)
temp_metadata_path_asset = self._generate_metadata_file(
current_asset_metadata=current_asset_metadata, # Pass the populated dict
processed_maps_details_asset=processed_maps_details_asset,
merged_maps_details_asset=merged_maps_details_asset,
filtered_classified_files_asset=filtered_classified_files_asset,
unmatched_files_paths=unmatched_files_paths, # Pass the list of unmatched files
map_details_asset=map_details_asset # Pass the filtered map details
map_details_asset=map_details_asset, # Pass the filtered map details
rules=rules, # Pass rules
# Pass retrieved config values
preset_name=preset_name_rule,
metadata_filename=metadata_filename_rule
)
# --- Organize Output Files for this asset ---
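The per-asset pipeline above repeatedly resolves settings through `_get_rule_with_fallback`, walking the new hierarchy from the most specific override to the static config. A minimal sketch of that lookup, under the assumption that the hierarchy resolves File, then Asset, then Source, then the config default (the dict-based stand-in for `SourceRule` and its field names are illustrative; the real structure lives in `rule_structure.py`):

```python
from typing import Any, Optional

def get_rule_with_fallback(rules: dict, key: str, asset_name: Optional[str] = None,
                           file_path: Optional[str] = None, default: Any = None) -> Any:
    """Return the most specific override for `key`, else `default`."""
    # 1. File level: the most specific override wins.
    if file_path is not None:
        file_rules = rules.get("files", {}).get(str(file_path), {})
        if key in file_rules:
            return file_rules[key]
    # 2. Asset level.
    if asset_name is not None:
        asset_rules = rules.get("assets", {}).get(asset_name, {})
        if key in asset_rules:
            return asset_rules[key]
    # 3. Source level.
    if key in rules.get("source", {}):
        return rules["source"][key]
    # 4. Static config default.
    return default

# Illustrative rule data (shape is an assumption).
rules = {
    "source": {"jpg_quality": 90},
    "assets": {"Rock01": {"jpg_quality": 95}},
    "files": {"Rock01_albedo.png": {"jpg_quality": 100}},
}
```

With this shape, a file-level override beats an asset-level one, which beats the source-level value, and the config default is used only when no level defines the key.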
@@ -880,14 +932,17 @@ class AssetProcessor:
# Wrap any other unexpected exceptions
raise AssetProcessingError(f"An unexpected error occurred during input extraction for {self.input_path.name}: {e}") from e
def _inventory_and_classify_files(self):
def _inventory_and_classify_files(self, rules: SourceRule):
"""
Scans workspace, classifies files according to preset rules, handling
16-bit prioritization and multiple variants of the same base map type.
Scans workspace, classifies files according to hierarchical rules and preset rules,
handling 16-bit prioritization and multiple variants of the same base map type.
"""
if not self.temp_dir:
raise AssetProcessingError("Temporary workspace not setup before inventory.")
log.info("Scanning and classifying files...")
log.debug("--- Starting File Inventory and Classification (v2) ---")
all_files_rel = []
@@ -907,10 +962,13 @@ class AssetProcessor:
self.classified_files = {"maps": [], "models": [], "extra": [], "ignored": []}
# --- Step 1: Identify Explicit 'Extra' Files ---
log.debug("Step 1: Checking for files to move to 'Extra' (using regex)...")
compiled_extra_regex = getattr(self.config, 'compiled_extra_regex', [])
log.debug(f" Compiled 'Extra' regex patterns: {[r.pattern for r in compiled_extra_regex]}")
# Get extra regex patterns using fallback logic
extra_patterns = self._get_rule_with_fallback(rules, 'extra_file_patterns', default=[])
compiled_extra_regex = [re.compile(p, re.IGNORECASE) for p in extra_patterns] # Compile with ignore case
log.debug(f" 'Extra' regex patterns (from rules/config): {[p for p in extra_patterns]}")
for file_rel_path in all_files_rel:
if file_rel_path in processed_files: continue
for compiled_regex in compiled_extra_regex:
@@ -921,10 +979,13 @@ class AssetProcessor:
log.debug(f" Added '{file_rel_path}' to processed files.")
break # Stop checking extra patterns for this file
# --- Step 2: Identify Model Files ---
log.debug("Step 2: Identifying model files (using regex)...")
compiled_model_regex = getattr(self.config, 'compiled_model_regex', [])
log.debug(f" Compiled 'Model' regex patterns: {[r.pattern for r in compiled_model_regex]}")
# Get model patterns using fallback logic
model_patterns = self._get_rule_with_fallback(rules, 'model_file_patterns', default=[])
compiled_model_regex = [re.compile(p, re.IGNORECASE) for p in model_patterns] # Compile with ignore case
log.debug(f" 'Model' regex patterns (from rules/config): {[p for p in model_patterns]}")
for file_rel_path in all_files_rel:
if file_rel_path in processed_files: continue
for compiled_regex in compiled_model_regex:
@@ -935,10 +996,28 @@ class AssetProcessor:
log.debug(f" Added '{file_rel_path}' to processed files.")
break # Stop checking model patterns for this file
# --- Step 3: Gather Potential Map Candidates (Refactored) ---
# --- Step 3: Gather Potential Map Candidates ---
log.debug("Step 3: Gathering potential map candidates (iterating files first)...")
# Compiled map keyword regex now maps: base_type -> [(regex, keyword, rule_index), ...]
compiled_map_keyword_regex_tuples = getattr(self.config, 'compiled_map_keyword_regex', {})
# Get map type mapping rules using fallback logic
map_type_mapping_rules = self._get_rule_with_fallback(rules, 'map_type_mapping', default=[])
# Compile regex patterns from the rules (similar to config initialization)
compiled_map_keyword_regex_tuples = defaultdict(list)
for rule_index, rule in enumerate(map_type_mapping_rules):
target_type = rule.get("target_type")
keywords = rule.get("keywords", [])
if target_type and keywords:
for keyword in keywords:
# Escape special regex characters in the keyword, then replace '*' with '.*'
pattern = re.escape(keyword).replace(re.escape('*'), '.*')
# NOTE: The pattern is deliberately left unanchored; classification uses a
# simple substring-style search against the stem. A more robust approach
# would tokenize the stem on the naming separator.
compiled_map_keyword_regex_tuples[target_type].append((re.compile(pattern, re.IGNORECASE), keyword, rule_index))
log.debug(f" 'Map Type Mapping' rules (from rules/config): {map_type_mapping_rules}")
for file_rel_path in all_files_rel:
# Skip files already classified as Extra or Model
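The keyword compilation in Step 3 escapes regex metacharacters first, so only the `*` wildcard carries meaning. A standalone sketch of that transform:

```python
import re

def compile_keyword(keyword: str) -> "re.Pattern":
    # Escape all regex metacharacters, then turn the escaped '*' wildcard
    # back into '.*' so only '*' is interpreted specially.
    pattern = re.escape(keyword).replace(re.escape('*'), '.*')
    return re.compile(pattern, re.IGNORECASE)
```

The double escaping keeps keywords like `norm+al` literal while still honoring the wildcard in `albedo*`.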
@@ -952,21 +1031,20 @@ class AssetProcessor:
for base_map_type, regex_tuples in compiled_map_keyword_regex_tuples.items():
if match_found: break # Stop checking types for this file once matched
# Get the original keywords list for the current rule index
# Assuming self.config.map_type_mapping holds the original list of dicts from JSON
# Get the original keywords list for the current rule index from the *applied* rules
original_rule = None
# Find the rule based on the first tuple's rule_index (they should all be the same for this base_map_type)
if regex_tuples:
current_rule_index = regex_tuples[0][2] # Get rule_index from the first tuple
if hasattr(self.config, 'map_type_mapping') and current_rule_index < len(self.config.map_type_mapping):
rule_candidate = self.config.map_type_mapping[current_rule_index]
if current_rule_index < len(map_type_mapping_rules):
rule_candidate = map_type_mapping_rules[current_rule_index]
# Verify it's the correct rule by checking target_type
if rule_candidate.get("target_type") == base_map_type:
original_rule = rule_candidate
else:
log.warning(f"Rule index mismatch for {base_map_type} at index {current_rule_index}. Searching...")
log.warning(f"Rule index mismatch for {base_map_type} at index {current_rule_index} in applied rules. Searching...")
# Fallback search if index doesn't match (shouldn't happen ideally)
for idx, rule in enumerate(self.config.map_type_mapping):
for idx, rule in enumerate(map_type_mapping_rules):
if rule.get("target_type") == base_map_type:
original_rule = rule
log.warning(f"Found rule for {base_map_type} at index {idx} instead.")
@@ -1013,7 +1091,13 @@ class AssetProcessor:
# --- Step 3.5: Identify Standalone 16-bit Variants (Not caught by keywords) ---
log.debug("Step 3.5: Checking for standalone 16-bit variants...")
compiled_bit_depth_regex = getattr(self.config, 'compiled_bit_depth_regex_map', {})
# Get bit depth regex map using fallback logic
bit_depth_patterns = self._get_rule_with_fallback(rules, 'bit_depth_patterns', default={})
compiled_bit_depth_regex = {
base_type: re.compile(pattern, re.IGNORECASE)
for base_type, pattern in bit_depth_patterns.items()
}
log.debug(f" 'Bit Depth' regex patterns (from rules/config): {bit_depth_patterns}")
for file_rel_path in all_files_rel:
# Skip if already processed or already identified as a candidate
if file_rel_path in processed_files or any(c['source_path'] == file_rel_path for c in potential_map_candidates):
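The dict comprehension in Step 3.5 turns a `{base_type: pattern}` map into compiled regexes. A small illustration of the same shape (the patterns here are assumptions, not the preset's actual ones):

```python
import re

# Illustrative {base_type: pattern} map; real patterns come from rules/config.
bit_depth_patterns = {"DISP": r"16[-_ ]?bit", "NRM": r"_16b"}

# Mirrors the comprehension above: one case-insensitive regex per base type.
compiled_bit_depth_regex = {
    base_type: re.compile(pattern, re.IGNORECASE)
    for base_type, pattern in bit_depth_patterns.items()
}
```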
@@ -1037,9 +1121,10 @@ class AssetProcessor:
log.debug(f"Total potential map candidates after checking standalone 16-bit: {len(potential_map_candidates)}")
# --- Step 4: Prioritize 16-bit Variants & Filter Candidates ---
log.debug("Step 4: Prioritizing 16-bit variants and filtering candidates...")
compiled_bit_depth_regex = getattr(self.config, 'compiled_bit_depth_regex_map', {})
# Use the compiled bit depth regex from Step 3.5
candidates_to_keep = []
candidates_to_ignore = [] # Store 8-bit versions superseded by 16-bit
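Step 4's prioritization can be sketched as follows: an 8-bit candidate is moved to the ignore list whenever a 16-bit sibling of the same base type exists (the candidate dict shape is an assumption):

```python
def prioritize_16bit(candidates: list) -> tuple:
    """Keep 16-bit variants; ignore 8-bit candidates they supersede."""
    keep, ignore = [], []
    # Base types for which a 16-bit variant is available.
    has_16bit = {c["base_type"] for c in candidates if c["bit_depth"] == 16}
    for c in candidates:
        if c["bit_depth"] == 8 and c["base_type"] in has_16bit:
            ignore.append(c)  # superseded by a 16-bit sibling
        else:
            keep.append(c)
    return keep, ignore

cands = [
    {"base_type": "DISP", "bit_depth": 8, "path": "disp.png"},
    {"base_type": "DISP", "bit_depth": 16, "path": "disp16.png"},
    {"base_type": "COL", "bit_depth": 8, "path": "col.png"},
]
kept, ignored = prioritize_16bit(cands)
```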
@@ -1152,11 +1237,14 @@ class AssetProcessor:
log.debug("--- End File Inventory and Classification (v2) ---")
def _determine_base_metadata(self) -> Tuple[List[str], Dict[Path, Optional[str]]]:
def _determine_base_metadata(self, rules: SourceRule) -> Tuple[List[str], Dict[Path, Optional[str]]]:
"""
Determines distinct asset base names within the input based on preset rules
and maps each relevant source file to its determined base name.
Args:
rules: The hierarchical rules object.
Returns:
Tuple[List[str], Dict[Path, Optional[str]]]:
- A list of unique, sanitized base names found.
@@ -1180,15 +1268,17 @@ class AssetProcessor:
return [sanitized_input_name], file_to_base_name_map
# --- Determine Base Names from Files ---
separator = self.config.source_naming_separator
indices_dict = self.config.source_naming_indices
# Get naming rules using fallback logic
naming_rules = self._get_rule_with_fallback(rules, 'source_naming_rules', default={})
separator = naming_rules.get('separator', self.config.source_naming_separator) # Fallback to config if not in rules
indices_dict = naming_rules.get('indices', self.config.source_naming_indices) # Fallback to config if not in rules
base_index_raw = indices_dict.get('base_name')
base_index = None
if base_index_raw is not None:
try:
base_index = int(base_index_raw)
except (ValueError, TypeError):
log.warning(f"Could not convert base_name index '{base_index_raw}' to integer. Base name determination might be inaccurate.")
log.warning(f"Could not convert base_name index '{base_index_raw}' to integer (from rules/config). Base name determination might be inaccurate.")
file_to_base_name_map: Dict[Path, Optional[str]] = {}
potential_base_names_per_file: Dict[Path, str] = {} # Store potential name for each file path
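The separator/index scheme above boils down to splitting the file stem and picking one part. A hedged sketch of that derivation (function name and `None` contract are assumptions):

```python
from typing import Optional

def derive_base_name(stem: str, separator: str, base_index: int) -> Optional[str]:
    # Split the stem on the configured separator and take the configured index.
    parts = stem.split(separator)
    if -len(parts) <= base_index < len(parts):
        return parts[base_index]
    return None  # index out of range -> base name cannot be determined
```

Returning `None` on an out-of-range index mirrors the warning path above, where base-name determination is flagged as potentially inaccurate rather than raising.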
@@ -1256,10 +1346,10 @@ class AssetProcessor:
return distinct_base_names, file_to_base_name_map
def _determine_single_asset_metadata(self, asset_base_name: str, filtered_classified_files: Dict[str, List[Dict]]) -> Dict[str, str]:
def _determine_single_asset_metadata(self, asset_base_name: str, filtered_classified_files: Dict[str, List[Dict]], rules: SourceRule) -> Dict[str, str]:
"""
Determines the asset_category and archetype for a single, specific asset
based on its filtered list of classified files.
Determines the asset_category, archetype, and supplier name for a single,
specific asset based on its filtered list of classified files and hierarchical rules.
Args:
asset_base_name: The determined base name for this specific asset.
@@ -1269,8 +1359,14 @@ class AssetProcessor:
Returns:
A dictionary containing {"asset_category": str, "archetype": str, "supplier_name": str}.
"""
log.debug(f"Determining category and archetype for asset: '{asset_base_name}'")
determined_category = self.config.default_asset_category # Start with default
log.debug(f"Determining category, archetype, and supplier for asset: '{asset_base_name}'")
# Determine Supplier Name using fallback
determined_supplier_name = self._get_rule_with_fallback(rules, 'supplier_name', asset_name=asset_base_name, default=self.config.supplier_name)
log.debug(f" Determined Supplier Name for '{asset_base_name}': {determined_supplier_name}")
determined_category = self._get_rule_with_fallback(rules, 'default_asset_category', asset_name=asset_base_name, default=self.config.default_asset_category) # Start with default from rules/config
determined_archetype = "Unknown"
# --- Determine Asset Category ---
@@ -1279,7 +1375,8 @@ class AssetProcessor:
log.debug(f" Category set to 'Asset' for '{asset_base_name}' due to model file presence.")
else:
# Check for Decal keywords only if not an Asset
decal_keywords = self.config.asset_category_rules.get('decal_keywords', [])
# Get decal keywords using fallback
decal_keywords = self._get_rule_with_fallback(rules, 'decal_keywords', asset_name=asset_base_name, default=[])
found_decal = False
# Check map names first for decal keywords
candidate_files = [f['source_path'] for f in filtered_classified_files.get('maps', [])]
@@ -1300,8 +1397,10 @@ class AssetProcessor:
log.debug(f" Determined Category for '{asset_base_name}': {determined_category}")
# --- Determine Archetype (Usage) ---
archetype_rules = self.config.archetype_rules
# Get archetype rules using fallback
archetype_rules = self._get_rule_with_fallback(rules, 'archetype_rules', asset_name=asset_base_name, default=[])
# Use stems from maps and models belonging *only* to this asset
check_stems = [f['source_path'].stem.lower() for f in filtered_classified_files.get('maps', [])]
check_stems.extend([f['source_path'].stem.lower() for f in filtered_classified_files.get('models', [])])
@@ -1334,13 +1433,15 @@ class AssetProcessor:
log.debug(f" Determined Archetype for '{asset_base_name}': {determined_archetype}")
return {"asset_category": determined_category, "archetype": determined_archetype}
return {"asset_category": determined_category, "archetype": determined_archetype, "supplier_name": determined_supplier_name}
def _process_individual_maps(self, filtered_maps_list: List[Dict], current_asset_metadata: Dict, loaded_data_cache: dict) -> Tuple[Dict[str, Dict[str, Dict]], Dict[str, Dict], str, List[Dict]]:
def _process_individual_maps(self, filtered_maps_list: List[Dict], current_asset_metadata: Dict, loaded_data_cache: dict, rules: SourceRule) -> Tuple[Dict[str, Dict[str, Dict]], Dict[str, Dict], str, List[Dict]]:
"""
Processes, resizes, and saves classified map files for a specific asset
that are NOT used as inputs for merge rules. Uses helper functions.
that are NOT used as inputs for merge rules, applying hierarchical rules.
Uses helper functions.
Args:
filtered_maps_list: List of map dictionaries belonging to the current asset.
@@ -1366,13 +1467,13 @@ class AssetProcessor:
aspect_ratio_change_string_asset: str = "N/A"
ignored_rough_maps: List[Dict] = [] # Store ignored native rough maps
# --- Settings retrieval ---
resolutions = self.config.image_resolutions
stats_res_key = self.config.calculate_stats_resolution
# --- Settings retrieval using fallback ---
resolutions = self._get_rule_with_fallback(rules, 'image_resolutions', asset_name=asset_name, default=self.config.image_resolutions)
stats_res_key = self._get_rule_with_fallback(rules, 'calculate_stats_resolution', asset_name=asset_name, default=self.config.calculate_stats_resolution)
stats_target_dim = resolutions.get(stats_res_key)
if not stats_target_dim: log.warning(f"Stats resolution key '{stats_res_key}' not found. Stats skipped for '{asset_name}'.")
gloss_keywords = self.config.source_glossiness_keywords
# target_pattern = self.config.target_filename_pattern # Not needed here, handled by _save_image
if not stats_target_dim: log.warning(f"Stats resolution key '{stats_res_key}' not found in rules/config. Stats skipped for '{asset_name}'.")
gloss_keywords = self._get_rule_with_fallback(rules, 'source_glossiness_keywords', asset_name=asset_name, default=self.config.source_glossiness_keywords)
# target_pattern = self._get_rule_with_fallback(rules, 'target_filename_pattern', asset_name=asset_name, default=self.config.target_filename_pattern) # Not needed here, handled by _save_image
base_name = asset_name # Use the asset name passed in
# --- Pre-process Glossiness -> Roughness ---
@@ -1403,15 +1504,16 @@ class AssetProcessor:
rough_source_to_use_info = native_rough_map_info
derived_from_gloss_flag['ROUGH'] = False
# --- Identify maps used in merge rules ---
# --- Identify maps used in merge rules (using fallback) ---
merge_rules = self._get_rule_with_fallback(rules, 'map_merge_rules', asset_name=asset_name, default=self.config.map_merge_rules)
merge_input_map_types = set()
for rule in self.config.map_merge_rules:
for rule in merge_rules:
inputs_mapping = rule.get("inputs", {})
for source_map_type in inputs_mapping.values():
# Use the base type for checking against merge rules
base_type = _get_base_map_type(source_map_type)
merge_input_map_types.add(base_type)
log.debug(f"Map types used as input for merge rules: {merge_input_map_types}")
log.debug(f"Map types used as input for merge rules (from rules/config): {merge_input_map_types}")
# --- Filter maps to process individually ---
maps_to_process_individually = []
@@ -1419,7 +1521,7 @@ class AssetProcessor:
base_map_type = _get_base_map_type(map_info['map_type'])
# Skip if this base map type is used in *any* merge rule input
if base_map_type in merge_input_map_types:
log.debug(f"Skipping individual processing for {map_info['map_type']} ({map_info['source_path']}) as its base type '{base_map_type}' is used in merge rules.")
log.debug(f"Skipping individual processing for {map_info['map_type']} ({map_info['source_path']}) as its base type '{base_map_type}' is used in merge rules (from rules/config).")
continue
# Skip native rough map if gloss was prioritized
if map_info['map_type'].startswith('ROUGH') and any(ignored['source_path'] == map_info['source_path'] for ignored in ignored_rough_maps):
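The merge-input filter above can be sketched as a set of base types fed by any merge rule, with individually-processed maps restricted to types outside that set (`_get_base_map_type` is approximated here by stripping a variant suffix, which is an assumption):

```python
def base_map_type(map_type: str) -> str:
    # Approximation of _get_base_map_type: drop a '-NN' variant suffix.
    return map_type.split('-')[0]

def maps_to_process(maps_list: list, merge_rules: list) -> list:
    # Base types consumed by any merge rule's "inputs" mapping.
    merge_inputs = {base_map_type(t)
                    for rule in merge_rules
                    for t in rule.get("inputs", {}).values()}
    # Keep only maps whose base type is not a merge input.
    return [m for m in maps_list if base_map_type(m["map_type"]) not in merge_inputs]

rules_sample = [{"inputs": {"R": "AO", "G": "ROUGH", "B": "METAL"}}]
maps_sample = [{"map_type": "AO"}, {"map_type": "COL"}, {"map_type": "ROUGH-01"}]
```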
@@ -1519,7 +1621,19 @@ class AssetProcessor:
'source_bit_depth': source_bit_depth_found or 8, # Use found depth or default
'involved_extensions': {original_extension} # Only self for individual maps
}
bit_depth_rule = self.config.get_bit_depth_rule(map_type) # Get rule for this specific map type
# Get bit depth rule using fallback logic
bit_depth_rules_map = self._get_rule_with_fallback(rules, 'output_bit_depth_rules', file_path=source_path_rel, asset_name=asset_name, default={})
bit_depth_rule = bit_depth_rules_map.get(map_type, 'respect') # Default to 'respect' if map type not in rules/config
# Get additional config values using fallback for _save_image
output_formats_16bit = self._get_rule_with_fallback(rules, 'get_16bit_output_formats', file_path=source_path_rel, asset_name=asset_name, default=self.config.get_16bit_output_formats())
output_format_8bit = self._get_rule_with_fallback(rules, 'get_8bit_output_format', file_path=source_path_rel, asset_name=asset_name, default=self.config.get_8bit_output_format())
resolution_threshold_for_jpg = self._get_rule_with_fallback(rules, 'resolution_threshold_for_jpg', file_path=source_path_rel, asset_name=asset_name, default=self.config.resolution_threshold_for_jpg)
force_lossless_map_types = self._get_rule_with_fallback(rules, 'force_lossless_map_types', asset_name=asset_name, default=self.config.force_lossless_map_types) # This rule applies to the map type, not an individual file path
jpg_quality = self._get_rule_with_fallback(rules, 'jpg_quality', file_path=source_path_rel, asset_name=asset_name, default=self.config.jpg_quality)
png_compression_level = self._get_rule_with_fallback(rules, '_core_settings', asset_name=asset_name, default=self.config._core_settings).get('PNG_COMPRESSION_LEVEL', 6) # This rule applies broadly, not per file
target_filename_pattern = self._get_rule_with_fallback(rules, 'target_filename_pattern', file_path=source_path_rel, asset_name=asset_name, default=self.config.target_filename_pattern)
# image_resolutions is already retrieved at the start of the method
save_result = self._save_image(
image_data=img_resized,
@ -1528,7 +1642,16 @@ class AssetProcessor:
asset_base_name=base_name,
source_info=source_info,
output_bit_depth_rule=bit_depth_rule,
temp_dir=self.temp_dir
temp_dir=self.temp_dir,
# Pass retrieved config values
output_formats_16bit=output_formats_16bit,
output_format_8bit=output_format_8bit,
resolution_threshold_for_jpg=resolution_threshold_for_jpg,
force_lossless_map_types=force_lossless_map_types,
jpg_quality=jpg_quality,
png_compression_level=png_compression_level,
target_filename_pattern=target_filename_pattern,
image_resolutions=resolutions # Pass the already retrieved resolutions dict
)
# --- 5. Store Result ---
@@ -1557,7 +1680,9 @@ class AssetProcessor:
return processed_maps_details_asset, image_stats_asset, aspect_ratio_change_string_asset, ignored_rough_maps
def _merge_maps_from_source(self, processed_maps_details_asset: Dict[str, Dict[str, Dict]], filtered_classified_files: Dict[str, List[Dict]], current_asset_metadata: Dict, loaded_data_cache: dict) -> Dict[str, Dict[str, Dict]]:
def _merge_maps_from_source(self, processed_maps_details_asset: Dict[str, Dict[str, Dict]], filtered_classified_files: Dict[str, List[Dict]], current_asset_metadata: Dict, loaded_data_cache: dict, rules: SourceRule) -> Dict[str, Dict[str, Dict]]:
"""
Merges channels from different SOURCE maps for a specific asset based on rules
in configuration, using helper functions for loading and saving.
@@ -1573,10 +1698,12 @@ class AssetProcessor:
"""
if not self.temp_dir: raise AssetProcessingError("Workspace not setup.")
asset_name = current_asset_metadata.get("asset_name", "UnknownAsset")
gloss_keywords = self.config.source_glossiness_keywords # Get gloss keywords
# Get gloss keywords using fallback logic
gloss_keywords = self._get_rule_with_fallback(rules, 'source_glossiness_keywords', asset_name=asset_name, default=self.config.source_glossiness_keywords)
merge_rules = self.config.map_merge_rules
log.info(f"Asset '{asset_name}': Applying {len(merge_rules)} map merging rule(s) from source...")
# Get merge rules using fallback logic
merge_rules = self._get_rule_with_fallback(rules, 'map_merge_rules', asset_name=asset_name, default=self.config.map_merge_rules)
log.info(f"Asset '{asset_name}': Applying {len(merge_rules)} map merging rule(s) from source (from rules/config)...")
# Initialize results for this asset
merged_maps_details_asset: Dict[str, Dict[str, Dict]] = defaultdict(dict)
@@ -1637,6 +1764,9 @@ class AssetProcessor:
# --- Determine common resolutions based on *processed* maps (as a proxy for available sizes) ---
# This assumes _process_individual_maps ran first and populated processed_maps_details_asset
possible_resolutions_per_input = []
# Get resolutions using fallback
resolutions = self._get_rule_with_fallback(rules, 'image_resolutions', asset_name=asset_name, default=self.config.image_resolutions)
for input_type in set(inputs_mapping.values()):
if input_type in processed_maps_details_asset:
res_keys = {res for res, details in processed_maps_details_asset[input_type].items() if isinstance(details, dict) and 'error' not in details}
@ -1652,7 +1782,7 @@ class AssetProcessor:
# We'll load the source at *all* target resolutions and let _load_and_transform_source handle skipping if upscale is needed.
log.debug(f"Input map type '{input_type}' for merge rule '{output_map_type}' might not have been processed individually. Will attempt loading source for all target resolutions.")
# Add all configured resolutions as possibilities for this input
possible_resolutions_per_input.append(set(self.config.image_resolutions.keys()))
possible_resolutions_per_input.append(set(resolutions.keys()))
if not possible_resolutions_per_input:
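Choosing common output resolutions across merge inputs is a plain set intersection over the per-input resolution-key sets collected above. A minimal sketch:

```python
def common_resolutions(possible_per_input: list) -> set:
    """Intersect the resolution-key sets available for every merge input."""
    if not possible_per_input:
        return set()  # no inputs -> nothing to merge at any resolution
    common = set(possible_per_input[0])
    for res_set in possible_per_input[1:]:
        common &= res_set  # keep only keys every input can provide
    return common
```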
@@ -1794,6 +1924,16 @@ class AssetProcessor:
log.debug(f"Merged channels for '{output_map_type}' ({current_res_key}). Result shape: {merged_image_float32.shape}, dtype: {merged_image_float32.dtype}")
# --- Save Merged Map using Helper ---
# Get additional config values using fallback for _save_image
output_formats_16bit = self._get_rule_with_fallback(rules, 'get_16bit_output_formats', asset_name=asset_name, default=self.config.get_16bit_output_formats())
output_format_8bit = self._get_rule_with_fallback(rules, 'get_8bit_output_format', asset_name=asset_name, default=self.config.get_8bit_output_format())
resolution_threshold_for_jpg = self._get_rule_with_fallback(rules, 'resolution_threshold_for_jpg', asset_name=asset_name, default=self.config.resolution_threshold_for_jpg)
force_lossless_map_types = self._get_rule_with_fallback(rules, 'force_lossless_map_types', asset_name=asset_name, default=self.config.force_lossless_map_types)
jpg_quality = self._get_rule_with_fallback(rules, 'jpg_quality', asset_name=asset_name, default=self.config.jpg_quality)
png_compression_level = self._get_rule_with_fallback(rules, '_core_settings', asset_name=asset_name, default=self.config._core_settings).get('PNG_COMPRESSION_LEVEL', 6)
target_filename_pattern = self._get_rule_with_fallback(rules, 'target_filename_pattern', asset_name=asset_name, default=self.config.target_filename_pattern)
# image_resolutions is already retrieved at the start of the method
save_result = self._save_image(
image_data=merged_image_float32, # Pass the merged float32 data
map_type=output_map_type,
@ -1801,7 +1941,16 @@ class AssetProcessor:
asset_base_name=base_name,
source_info=source_info_for_save, # Pass collected source info
output_bit_depth_rule=rule_bit_depth, # Pass the rule's requirement
temp_dir=self.temp_dir
temp_dir=self.temp_dir,
# Pass retrieved config values
output_formats_16bit=output_formats_16bit,
output_format_8bit=output_format_8bit,
resolution_threshold_for_jpg=resolution_threshold_for_jpg,
force_lossless_map_types=force_lossless_map_types,
jpg_quality=jpg_quality,
png_compression_level=png_compression_level,
target_filename_pattern=target_filename_pattern,
image_resolutions=resolutions # Pass the already retrieved resolutions dict
)
# --- Record details locally ---
@@ -1822,9 +1971,13 @@ class AssetProcessor:
return merged_maps_details_asset
def _generate_metadata_file(self, current_asset_metadata: Dict, processed_maps_details_asset: Dict[str, Dict[str, Dict]], merged_maps_details_asset: Dict[str, Dict[str, Dict]], filtered_classified_files_asset: Dict[str, List[Dict]], unmatched_files_paths: List[Path], map_details_asset: Dict[str, Dict]) -> Path:
def _generate_metadata_file(self, current_asset_metadata: Dict, processed_maps_details_asset: Dict[str, Dict[str, Dict]], merged_maps_details_asset: Dict[str, Dict[str, Dict]], filtered_classified_files_asset: Dict[str, List[Dict]], unmatched_files_paths: List[Path], map_details_asset: Dict[str, Dict], rules: SourceRule,
# Parameters passed down from calling method using _get_rule_with_fallback
preset_name: str, metadata_filename: str) -> Path:
"""
Gathers metadata for a specific asset and writes it to a temporary JSON file.
Gathers metadata for a specific asset and writes it to a temporary JSON file,
applying hierarchical rules for metadata fields.
Args:
current_asset_metadata: Base metadata for this asset (name, category, archetype, etc.).
@@ -1833,6 +1986,9 @@ class AssetProcessor:
filtered_classified_files_asset: Classified files belonging only to this asset.
unmatched_files_paths: List of relative paths for files not matched to any base name.
map_details_asset: Dictionary containing details like source bit depth, gloss inversion per map type.
rules: The hierarchical rules object.
preset_name: The name of the preset being used (retrieved via fallback).
metadata_filename: The standard filename for metadata (retrieved via fallback).
Returns:
@@ -1845,9 +2001,15 @@ class AssetProcessor:
asset_name = "UnknownAsset_Metadata" # Fallback for filename
log.info(f"Generating metadata file for asset '{asset_name}'...")
# Start with the base metadata passed in for this asset
final_metadata = current_asset_metadata.copy()
# Apply hierarchical rules for specific metadata fields
final_metadata["supplier_name"] = self._get_rule_with_fallback(rules, 'supplier_name', asset_name=asset_name, default=final_metadata.get("supplier_name", self.config.supplier_name))
final_metadata["asset_category"] = self._get_rule_with_fallback(rules, 'default_asset_category', asset_name=asset_name, default=final_metadata.get("asset_category", self.config.default_asset_category))
final_metadata["archetype"] = self._get_rule_with_fallback(rules, 'archetype', asset_name=asset_name, default=final_metadata.get("archetype", "Unknown")) # Archetype is determined earlier, but allow override
# Populate map details from the specific asset's processing results
# Add merged map channel stats
final_metadata["merged_map_channel_stats"] = current_asset_metadata.get("merged_map_channel_stats", {}) # Get from passed metadata
@@ -1897,7 +2059,7 @@ class AssetProcessor:
# Add processing info
final_metadata["_processing_info"] = {
"preset_used": self.config.preset_name,
"preset_used": preset_name, # Use passed-in parameter
"timestamp_utc": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
"input_source": str(self.input_path.name), # Add original input source
}
@@ -1907,8 +2069,9 @@ class AssetProcessor:
if key in final_metadata and isinstance(final_metadata[key], list): final_metadata[key].sort()
# Use asset name in temporary filename to avoid conflicts
metadata_filename = f"{asset_name}_{self.config.metadata_filename}"
output_path = self.temp_dir / metadata_filename
# Use passed-in parameter for the base metadata filename
temp_metadata_filename = f"{asset_name}_{metadata_filename}"
output_path = self.temp_dir / temp_metadata_filename
log.debug(f"Writing metadata for asset '{asset_name}' to temporary file: {output_path}")
try:
with open(output_path, 'w', encoding='utf-8') as f:
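Prefixing the configured metadata filename with the asset name, as done above, keeps per-asset temp files from colliding when one source yields several assets. A minimal sketch of that path construction (the helper name is illustrative):

```python
from pathlib import Path

def temp_metadata_path(temp_dir: Path, asset_name: str, metadata_filename: str) -> Path:
    # One metadata file per asset inside the shared temp workspace.
    return temp_dir / f"{asset_name}_{metadata_filename}"
```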
@@ -2181,10 +2344,14 @@ class AssetProcessor:
log.warning(f"No files found in input for prediction: {self.input_path.name}")
return None # Return None if no files found
# 3. Lightweight Classification for Stems and Potential Maps
map_type_mapping = self.config.map_type_mapping
model_patterns = self.config.asset_category_rules.get('model_patterns', [])
separator = self.config.source_naming_separator
# 3. Lightweight Classification for Stems and Potential Maps using rules
map_type_mapping_rules = self._get_rule_with_fallback(rules, 'map_type_mapping', default=[])
model_patterns = self._get_rule_with_fallback(rules, 'model_file_patterns', default=[]) # Use model_file_patterns from rules/config
# Get naming rules using fallback logic for separator
naming_rules = self._get_rule_with_fallback(rules, 'source_naming_rules', default={})
separator = naming_rules.get('separator', self.config.source_naming_separator) # Fallback to config if not in rules
processed_filenames = set() # Track full filenames processed
potential_map_files = {} # Store fname -> potential map_type
@@ -2195,16 +2362,24 @@ class AssetProcessor:
fstem_lower = fstem.lower()
name_parts = fstem_lower.split(separator)
# Check map rules first
# Check map rules first (using rules/config)
map_matched = False
for mapping_rule in map_type_mapping:
source_keywords, standard_map_type = mapping_rule
if standard_map_type not in self.config.standard_map_types: continue
for mapping_rule in map_type_mapping_rules:
# Ensure the rule has the expected structure (list of keywords, target type)
if not isinstance(mapping_rule, dict) or 'keywords' not in mapping_rule or 'target_type' not in mapping_rule:
log.warning(f"Prediction: Skipping invalid map type mapping rule: {mapping_rule}")
continue
source_keywords = mapping_rule.get('keywords', [])
standard_map_type = mapping_rule.get('target_type')
if standard_map_type not in self.config.standard_map_types: continue # Check against standard types from config
for keyword in source_keywords:
kw_lower = keyword.lower().strip('*')
if kw_lower in name_parts:
is_exact_match = any(part == kw_lower for part in name_parts)
if is_exact_match:
# Check if the keyword exists as a whole part in the split stem
if any(part == kw_lower for part in name_parts):
candidate_stems.add(fstem) # Add unique stem
potential_map_files[fname] = standard_map_type # Store potential type
processed_filenames.add(fname)
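The exact-part check above (strip the wildcard, split the stem on the separator, compare whole parts) avoids false positives like `col` matching `color`. A standalone sketch:

```python
def keyword_matches_stem(keyword: str, stem: str, separator: str = "_") -> bool:
    # Strip wildcard markers and compare against whole stem parts only,
    # so a short keyword never matches inside a longer token.
    kw = keyword.lower().strip('*')
    return any(part == kw for part in stem.lower().split(separator))
```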
@@ -2232,12 +2407,12 @@ class AssetProcessor:
base_name_fallback = self.input_path.stem if self.input_path.is_file() else self.input_path.name
determined_base_name = base_name_fallback
else:
# 4. Replicate _determine_base_metadata logic for base name
# 4. Replicate _determine_base_metadata logic for base name using rules
determined_base_name = "UnknownAssetName"
separator = self.config.source_naming_separator
indices_dict = self.config.source_naming_indices
# Use the separator and indices_dict retrieved earlier using fallback logic
base_index_raw = indices_dict.get('base_name')
log.debug(f"[PREDICTION] Base Name Determination: Separator='{separator}', Indices Dict={indices_dict}, Raw Base Index='{base_index_raw}'") # DEBUG PREDICTION
log.debug(f"[PREDICTION] Base Name Determination: Separator='{separator}', Indices Dict={indices_dict}, Raw Base Index='{base_index_raw}' (from rules/config)") # DEBUG PREDICTION
base_index = None
if base_index_raw is not None:
@@ -2276,13 +2451,13 @@ class AssetProcessor:
log.debug(f"[PREDICTION] Final determined base name for prediction: '{final_base_name}'") # DEBUG PREDICTION
final_supplier_name = self._sanitize_filename(supplier_name)
# 6. Predict Output Filenames
file_predictions = {}
target_pattern = self.config.target_filename_pattern
# Use highest resolution key as a placeholder for prediction
highest_res_key = "Res?" # Fallback
if self.config.image_resolutions:
highest_res_key = max(self.config.image_resolutions, key=self.config.image_resolutions.get)
# 6. Predict Output Filenames using rules
target_pattern = self._get_rule_with_fallback('target_filename_pattern', default=self.config.target_filename_pattern)
# Get resolutions using fallback
resolutions = self._get_rule_with_fallback('image_resolutions', default=self.config.image_resolutions)
highest_res_key = "Res?" # Placeholder resolution for prediction
if resolutions:
highest_res_key = max(resolutions, key=resolutions.get)
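Selecting the highest-resolution key with `max(..., key=dict.get)` relies on the mapping's values being comparable pixel sizes. A small sketch; the resolution names and sizes here are assumed for illustration:

```python
# Sketch of the highest-resolution selection above: max() iterates the dict's
# keys and orders them by the mapped pixel size. Keys/values are examples only.
resolutions = {"1K": 1024, "2K": 2048, "4K": 4096}

highest_res_key = "Res?"  # fallback placeholder, as in the prediction code
if resolutions:
    highest_res_key = max(resolutions, key=resolutions.get)

print(highest_res_key)  # -> 4K
```

The `if resolutions:` guard matters: calling `max()` on an empty mapping raises `ValueError`, so the `"Res?"` placeholder survives only when no resolutions are configured.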
for input_fname, map_type in potential_map_files.items():
# Assume PNG for prediction, extension might change based on bit depth rules later
@@ -2311,7 +2486,7 @@ class AssetProcessor:
# --- New Detailed Prediction Method ---
def get_detailed_file_predictions(self) -> list[dict] | None:
def get_detailed_file_predictions(self, rules: SourceRule) -> list[dict] | None:
"""
Performs extraction and classification to provide a detailed list of all
files found within the input and their predicted status/output name,
@@ -2332,14 +2507,16 @@ class AssetProcessor:
try:
# --- Perform necessary setup and classification ---
self._setup_workspace()
self._extract_input()
# Run classification - this populates self.classified_files
self._inventory_and_classify_files()
# Run classification - this populates self.classified_files, pass rules
self._inventory_and_classify_files(rules)
# --- Determine distinct assets and file mapping ---
# This uses the results from _inventory_and_classify_files
distinct_base_names, file_to_base_name_map = self._determine_base_metadata()
# This uses the results from _inventory_and_classify_files, pass rules
distinct_base_names, file_to_base_name_map = self._determine_base_metadata(rules)
log.debug(f"Prediction: Determined base names: {distinct_base_names}")
log.debug(f"Prediction: File to base name map: { {str(k):v for k,v in file_to_base_name_map.items()} }")
@@ -2373,11 +2550,14 @@ class AssetProcessor:
# Note: This is a temporary list for prediction generation, not modifying the instance's classified_files permanently
# self.classified_files["maps"] = maps_with_predicted_types # Avoid modifying instance state
# --- Prepare for filename prediction ---
target_pattern = self.config.target_filename_pattern
# --- Prepare for filename prediction using rules ---
target_pattern = self._get_rule_with_fallback(rules, 'target_filename_pattern', default=self.config.target_filename_pattern)
# Get resolutions using fallback
resolutions = self._get_rule_with_fallback(rules, 'image_resolutions', default=self.config.image_resolutions)
highest_res_key = "Res?" # Placeholder resolution for prediction
if self.config.image_resolutions:
highest_res_key = max(self.config.image_resolutions, key=self.config.image_resolutions.get)
if resolutions:
highest_res_key = max(resolutions, key=resolutions.get)
# --- Process all classified files (including maps with predicted types) ---
all_classified_files_with_category = []
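The `_get_rule_with_fallback` helper is referenced in this diff but its body is not shown. A plausible minimal sketch of the rule-then-config fallback idea (the `Rules` stand-in and `get_rule_with_fallback` are hypothetical; the real helper's signature and lookup may differ):

```python
# Hypothetical sketch of a rule -> config fallback lookup, illustrating the
# "dynamic rule overrides static config" idea from the hierarchical rule system.
class Rules:
    """Stand-in for a SourceRule carrying optional per-source overrides."""
    def __init__(self, **overrides):
        self._overrides = overrides

    def get(self, key):
        return self._overrides.get(key)

def get_rule_with_fallback(rules, key, default):
    """Return the rule override for `key` if set, else the config default."""
    if rules is not None:
        value = rules.get(key)
        if value is not None:
            return value
    return default

rules = Rules(target_filename_pattern="{base}_{map}_{res}")
print(get_rule_with_fallback(rules, "target_filename_pattern", "{base}_{map}"))  # override wins
print(get_rule_with_fallback(rules, "image_resolutions", {"1K": 1024}))          # falls back to default
```

Keeping the fallback in one helper means every caller (filename pattern, resolutions, naming indices) gets identical override semantics.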

View File

@@ -10,18 +10,23 @@ log = logging.getLogger(__name__)
log.info(f"sys.path: {sys.path}")
from PySide6.QtWidgets import (
QApplication, QMainWindow, QWidget, QVBoxLayout, QHBoxLayout, QSplitter, # Added QSplitter
QApplication, QMainWindow, QWidget, QVBoxLayout, QHBoxLayout, QSplitter, QTableView, # Added QSplitter, QTableView
QPushButton, QComboBox, QTableWidget, QTableWidgetItem, QHeaderView,
QProgressBar, QLabel, QFrame, QCheckBox, QSpinBox, QListWidget, QTextEdit, # Added QListWidget, QTextEdit
QLineEdit, QMessageBox, QFileDialog, QInputDialog, QListWidgetItem, QTabWidget, # Added more widgets
QFormLayout, QGroupBox, QAbstractItemView, QSizePolicy, # Added more layout/widget items
QMenuBar, QMenu # Added for menu
)
from PySide6.QtCore import Qt, QThread, Slot, Signal, QObject # Added Signal, QObject
from PySide6.QtCore import Qt, QThread, Slot, Signal, QObject, QModelIndex # Added Signal, QObject, QModelIndex
from PySide6.QtGui import QColor, QAction, QPalette # Add QColor import, QAction, QPalette
# --- Backend Imports for Data Structures ---
from rule_structure import SourceRule, AssetRule, FileRule # Import Rule Structures
from gui.rule_editor_widget import RuleEditorWidget # Import the new rule editor widget
# --- GUI Model Imports ---
from gui.preview_table_model import PreviewTableModel, PreviewSortFilterProxyModel
from gui.rule_hierarchy_model import RuleHierarchyModel # Import the new hierarchy model
# --- Backend Imports ---
script_dir = Path(__file__).parent
@@ -150,6 +155,8 @@ def setup_table_widget_with_controls(parent_layout, label_text, attribute_name,
class MainWindow(QMainWindow):
# Signal emitted when presets change in the editor panel
presets_changed_signal = Signal()
# Signal to trigger prediction handler in its thread
start_prediction_signal = Signal(list, str, object) # input_paths, preset_name, rules
def __init__(self):
super().__init__()
@@ -159,6 +166,8 @@ class MainWindow(QMainWindow):
# --- Internal State ---
self.current_asset_paths = set() # Store unique paths of assets added
self.rule_hierarchy_model = RuleHierarchyModel() # Instantiate the hierarchy model
self._current_source_rule = None # Store the current SourceRule object
# --- Editor State ---
self.current_editing_preset_path = None
@@ -172,6 +181,27 @@ class MainWindow(QMainWindow):
self.prediction_handler = None
self.setup_threads()
# --- Preview Area (Table) Setup ---
# Initialize models
self.preview_model = PreviewTableModel()
self.preview_proxy_model = PreviewSortFilterProxyModel()
self.preview_proxy_model.setSourceModel(self.preview_model)
# Initialize table view and placeholder
self.preview_table_view = QTableView()
self.preview_table_view.setModel(self.preview_proxy_model)
self.preview_placeholder_label = QLabel("Please select a preset to view file predictions")
self.preview_placeholder_label.setAlignment(Qt.AlignmentFlag.AlignCenter)
self.preview_placeholder_label.setStyleSheet("QLabel { font-size: 16px; color: grey; }")
# Initially hide the table view and show the placeholder
self.preview_table_view.setVisible(False)
self.preview_placeholder_label.setVisible(True)
# Apply style sheet to remove borders and rounded corners
self.preview_table_view.setStyleSheet("QTableView { border: none; }")
# --- Main Layout with Splitter ---
self.splitter = QSplitter(Qt.Orientation.Horizontal)
self.setCentralWidget(self.splitter)
@@ -361,7 +391,6 @@ class MainWindow(QMainWindow):
self.output_path_edit.setText("") # Clear on error
self.statusBar().showMessage(f"Error setting default output path: {e}", 5000)
# --- Drag and Drop Area ---
self.drag_drop_area = QFrame()
self.drag_drop_area.setFrameShape(QFrame.Shape.StyledPanel)
@@ -375,27 +404,31 @@ class MainWindow(QMainWindow):
main_layout.addWidget(self.drag_drop_area)
self.drag_drop_area.setVisible(False) # Hide the specific visual drag/drop area
# --- Preview Area (Table) ---
self.preview_label = QLabel("File Preview:") # Updated Label
self.preview_table = QTableWidget() # Keep QTableWidget for now, will replace with QTableView later
# Initialize models
self.preview_model = PreviewTableModel()
self.preview_proxy_model = PreviewSortFilterProxyModel()
self.preview_proxy_model.setSourceModel(self.preview_model)
# --- Hierarchy and Rule Editor Splitter ---
self.hierarchy_rule_splitter = QSplitter(Qt.Orientation.Vertical)
main_layout.addWidget(self.hierarchy_rule_splitter, 1) # Give it stretch factor
# Use the proxy model for the table view
# NOTE: QTableWidget is simpler but less flexible with models.
# For full model/view benefits (like multi-column sorting via proxy),
# we should ideally switch to QTableView. Sticking with QTableWidget for minimal change first.
# However, QTableWidget doesn't fully support QSortFilterProxyModel for sorting.
# Let's switch to QTableView now for proper model/proxy integration.
from PySide6.QtWidgets import QTableView # Import QTableView
# --- Hierarchy Tree View ---
from PySide6.QtWidgets import QTreeView # Import QTreeView
self.hierarchy_tree_view = QTreeView()
self.hierarchy_tree_view.setHeaderHidden(True) # Hide header for simple hierarchy display
self.hierarchy_tree_view.setEditTriggers(QAbstractItemView.EditTrigger.NoEditTriggers) # Make items non-editable
self.hierarchy_tree_view.setModel(self.rule_hierarchy_model) # Set the hierarchy model
self.hierarchy_tree_view.clicked.connect(self._on_hierarchy_item_clicked) # Connect click signal
self.hierarchy_rule_splitter.addWidget(self.hierarchy_tree_view)
self.preview_table_view = QTableView() # Use QTableView instead of QTableWidget
self.preview_table_view.setModel(self.preview_proxy_model) # Set the proxy model
# --- Rule Editor Widget ---
self.rule_editor_widget = RuleEditorWidget()
self.rule_editor_widget.rule_updated.connect(self._on_rule_updated) # Connect rule updated signal
self.hierarchy_rule_splitter.addWidget(self.rule_editor_widget)
# Set initial sizes for the splitter
self.hierarchy_rule_splitter.setSizes([200, 400]) # Adjust sizes as needed
# --- Preview Area (Table) - Moved into the splitter ---
# The preview table view will now be used to display files for the selected asset/source
# Set headers and resize modes using the model's headerData
# The model defines the columns and headers
header = self.preview_table_view.horizontalHeader()
# Set resize modes for detailed columns
header.setSectionResizeMode(self.preview_model.COL_STATUS, QHeaderView.ResizeMode.ResizeToContents)
@@ -436,24 +469,29 @@ class MainWindow(QMainWindow):
header.moveSection(header.visualIndex(self.preview_model.COL_ADDITIONAL_FILES), 4)
# Current visual: [0, 1, 4, 2, 5, 3] - This looks correct.
main_layout.addWidget(self.preview_label)
# Add placeholder label for the preview area (already done, just referencing)
# self.preview_placeholder_label = QLabel("Please select a preset to view file predictions") # Already initialized in __init__
# self.preview_placeholder_label.setAlignment(Qt.AlignmentFlag.AlignCenter) # Already done
# self.preview_placeholder_label.setStyleSheet("QLabel { font-size: 16px; color: grey; }") # Optional styling # Already done
# Add placeholder label for the preview area
self.preview_placeholder_label = QLabel("Please select a preset to view file predictions")
self.preview_placeholder_label.setAlignment(Qt.AlignmentFlag.AlignCenter)
self.preview_placeholder_label.setStyleSheet("QLabel { font-size: 16px; color: grey; }") # Optional styling
# Add both the table view and the placeholder label to the layout (already done, just referencing)
# We will manage their visibility later (already done, just referencing)
# main_layout.addWidget(self.preview_placeholder_label, 1) # Give it stretch factor # REMOVED - Now managed by splitter
# main_layout.addWidget(self.preview_table_view, 1) # Give it stretch factor # REMOVED - Now managed by splitter
# Add both the table view and the placeholder label to the layout
# We will manage their visibility later
main_layout.addWidget(self.preview_placeholder_label, 1) # Give it stretch factor
main_layout.addWidget(self.preview_table_view, 1) # Give it stretch factor
# Initially hide the table view and show the placeholder (already done, just referencing)
# self.preview_table_view.setVisible(False) # Already done
# self.preview_placeholder_label.setVisible(True) # Already done
# Initially hide the table view and show the placeholder
self.preview_table_view.setVisible(False)
self.preview_placeholder_label.setVisible(True)
# Apply style sheet to remove borders and rounded corners (already done, just referencing)
# self.preview_table_view.setStyleSheet("QTableView { border: none; }") # Already done
# Apply style sheet to remove borders and rounded corners
self.preview_table_view.setStyleSheet("QTableView { border: none; }")
# --- Add Preview Table View to Splitter ---
# The preview table view will now be placed below the hierarchy tree view in the splitter
# It will display the files associated with the selected item in the hierarchy
self.hierarchy_rule_splitter.addWidget(self.preview_table_view)
# Set initial sizes for the splitter (adjusting to include the table view)
self.hierarchy_rule_splitter.setSizes([200, 200, 400]) # Hierarchy, Rule Editor, File Preview
# --- Progress Bar ---
self.progress_bar = QProgressBar()
@@ -751,10 +789,22 @@ class MainWindow(QMainWindow):
if self.processing_thread and self.processing_handler:
try: self.processing_thread.started.disconnect()
except RuntimeError: pass
# Use the current SourceRule from the hierarchy model
if self._current_source_rule is None:
log.error("Cannot start processing: No rule hierarchy available.")
self.statusBar().showMessage("Error: No rule hierarchy available. Run preview first.", 5000)
self.set_controls_enabled(True)
self.cancel_button.setEnabled(False)
self.start_button.setText("Start Processing")
return
log.debug(f"Using SourceRule '{self._current_source_rule.name}' for processing.")
self.processing_thread.started.connect(
lambda: self.processing_handler.run_processing(
input_paths, selected_preset, output_dir_str, overwrite, num_workers,
# Pass Blender integration settings
rules=self._current_source_rule, # Pass the current SourceRule
run_blender=self.blender_integration_checkbox.isChecked(),
nodegroup_blend_path=self.nodegroup_blend_path_input.text(),
materials_blend_path=self.materials_blend_path_input.text(),
@@ -969,17 +1019,25 @@ class MainWindow(QMainWindow):
log.info(f"[{time.time():.4f}] Requesting background preview update for {len(input_paths)} items, Preset='{selected_preset}'")
self.statusBar().showMessage(f"Updating preview for '{selected_preset}'...", 0)
# Clearing is handled by model's set_data now, no need to clear table view directly
self.setup_threads() # Ensure threads are ready
if self.prediction_thread and self.prediction_handler:
try: self.prediction_thread.started.disconnect() # Disconnect previous lambda if any
except RuntimeError: pass
# Connect the lambda to start the prediction
self.prediction_thread.started.connect(
lambda: self.prediction_handler.run_prediction(input_paths, selected_preset)
)
# Create a placeholder SourceRule instance (replace with actual rule loading later)
placeholder_rules = SourceRule() # Temporary rule for passing data
log.debug(f"Created placeholder SourceRule for prediction.")
# Start the prediction thread
log.debug(f"[{time.time():.4f}] Starting prediction thread...")
self.prediction_thread.start()
log.debug(f"[{time.time():.4f}] Prediction thread start requested.")
# Emit the signal to trigger run_prediction in the prediction thread
log.debug(f"[{time.time():.4f}] Emitting start_prediction_signal...")
self.start_prediction_signal.emit(input_paths, selected_preset, placeholder_rules)
log.debug(f"[{time.time():.4f}] start_prediction_signal emitted.")
else:
log.error(f"[{time.time():.4f}][T:{thread_id}] Failed to start prediction: Thread or handler not initialized.")
self.statusBar().showMessage("Error: Failed to initialize prediction thread.", 5000)
@@ -1013,7 +1071,10 @@ class MainWindow(QMainWindow):
self.prediction_thread = QThread(self)
self.prediction_handler = PredictionHandler()
self.prediction_handler.moveToThread(self.prediction_thread)
self.prediction_handler.prediction_results_ready.connect(self.on_prediction_results_ready) # Updated slot below
# Connect the new signal to the handler's run_prediction slot using QueuedConnection
self.start_prediction_signal.connect(self.prediction_handler.run_prediction, Qt.ConnectionType.QueuedConnection)
self.prediction_handler.prediction_results_ready.connect(self.on_prediction_results_ready) # Connect the file list signal
self.prediction_handler.rule_hierarchy_ready.connect(self._on_rule_hierarchy_ready) # Connect the new hierarchy signal
self.prediction_handler.prediction_finished.connect(self.on_prediction_finished)
self.prediction_handler.status_message.connect(self.show_status_message)
self.prediction_handler.prediction_finished.connect(self.prediction_thread.quit)
@@ -1613,6 +1674,84 @@ class MainWindow(QMainWindow):
else:
event.accept() # Accept close event
# --- Slots for Hierarchy and Rule Editor ---
@Slot(QModelIndex)
def _on_hierarchy_item_clicked(self, index: QModelIndex):
"""Loads the selected rule item into the rule editor and filters the preview table."""
if index.isValid():
rule_item = self.rule_hierarchy_model.get_item_from_index(index)
if rule_item:
rule_type_name = type(rule_item).__name__
log.debug(f"Hierarchy item clicked: {rule_type_name} - {getattr(rule_item, 'name', 'N/A')}")
self.rule_editor_widget.load_rule(rule_item, rule_type_name)
# Filter the preview table based on the selected item
if isinstance(rule_item, SourceRule):
# Show all files for the source
self.preview_proxy_model.setFilterRegularExpression("") # Clear filter
self.preview_proxy_model.setFilterKeyColumn(-1) # Apply to all columns (effectively no column filter)
log.debug("Filtering preview table: Showing all files for Source.")
elif isinstance(rule_item, AssetRule):
# Show files belonging to this asset
# Filter by the 'source_asset' column (which stores the asset name/path)
# Need to escape potential regex special characters in the asset name/path
filter_string = "^" + rule_item.asset_name.replace('\\', '\\\\').replace('.', '\\.') + "$"
self.preview_proxy_model.setFilterRegularExpression(filter_string)
self.preview_proxy_model.setFilterKeyColumn(self.preview_model.COL_SOURCE_ASSET) # Filter by the source_asset column
log.debug(f"Filtering preview table: Showing files for Asset '{rule_item.asset_name}'. Filter: '{filter_string}' on column {self.preview_model.COL_SOURCE_ASSET}")
elif isinstance(rule_item, FileRule):
# Show only this specific file
# Filter by the 'original_path' column
filter_string = "^" + rule_item.file_path.replace('\\', '\\\\').replace('.', '\\.') + "$"
self.preview_proxy_model.setFilterRegularExpression(filter_string)
self.preview_proxy_model.setFilterKeyColumn(self.preview_model.COL_ORIGINAL_PATH) # Filter by original_path column
log.debug(f"Filtering preview table: Showing file '{rule_item.file_path}'. Filter: '{filter_string}' on column {self.preview_model.COL_ORIGINAL_PATH}")
else:
# Clear filter for unknown types
self.preview_proxy_model.setFilterRegularExpression("")
self.preview_proxy_model.setFilterKeyColumn(-1)
log.warning(f"Clicked item has unknown type {type(rule_item)}. Clearing preview filter.")
else:
log.warning("Clicked item has no associated rule object. Clearing editor and preview filter.")
self.rule_editor_widget.clear_editor()
self.preview_proxy_model.setFilterRegularExpression("") # Clear filter
self.preview_proxy_model.setFilterKeyColumn(-1)
else:
log.debug("Clicked item index is invalid. Clearing editor and preview filter.")
self.rule_editor_widget.clear_editor()
self.preview_proxy_model.setFilterRegularExpression("") # Clear filter
self.preview_proxy_model.setFilterKeyColumn(-1)
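The manual `replace('\\', '\\\\').replace('.', '\\.')` escaping above only covers backslashes and dots; `re.escape` handles every regex metacharacter. A small sketch of building the same exact-match pattern, independent of the Qt proxy model (note that `QRegularExpression` uses PCRE syntax, with which `re.escape` output is generally compatible):

```python
import re

# Build an exact-match filter pattern for a file path, as the hierarchy click
# handler does, but via re.escape so characters like +, (, ) are also safe.
def exact_match_pattern(text: str) -> str:
    return "^" + re.escape(text) + "$"

pattern = exact_match_pattern(r"textures\wood (old)\wood_col.4k.png")
assert re.match(pattern, r"textures\wood (old)\wood_col.4k.png")
# The escaped dots no longer act as wildcards:
assert not re.match(pattern, r"textures\wood (old)\wood_colX4kXpng")
```

Without full escaping, a path containing `(`, `+`, or `[` would produce an invalid or over-matching filter expression.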
@Slot(object)
def _on_rule_updated(self, rule_object):
"""Handles the signal when a rule is updated in the editor."""
# This slot is called when an attribute is changed in the RuleEditorWidget.
# The rule_object passed is the actual object instance from the hierarchy model.
# Since the RuleEditorWidget modifies the object in place, we don't need to
# explicitly update the model's data structure here.
# However, if the change affects the display in the hierarchy tree or preview table,
# we might need to emit dataChanged signals or trigger updates.
# For now, just log the update.
log.debug(f"Rule object updated in editor: {type(rule_object).__name__} - {getattr(rule_object, 'name', 'N/A')}")
# TODO: Consider if any UI updates are needed based on the rule change.
# E.g., if a rule name changes, the hierarchy tree might need a dataChanged signal.
# If a rule affects file output names, the preview table might need updating.
# This is complex and depends on which rule attributes are editable and their impact.
@Slot(object)
def _on_rule_hierarchy_ready(self, source_rule: SourceRule):
"""Receives the generated SourceRule hierarchy and updates the tree view model."""
log.info(f"Received rule hierarchy ready signal for input: {source_rule.input_path}")
self._current_source_rule = source_rule # Store the generated rule hierarchy
self.rule_hierarchy_model.set_root_rule(source_rule) # Update the tree view model
self.hierarchy_tree_view.expandToDepth(0) # Expand the first level (Source and Assets)
log.debug("Rule hierarchy model updated and tree view expanded.")
# --- Main Execution ---
def run_gui():
"""Initializes and runs the Qt application."""

View File

@@ -1,12 +1,14 @@
from rule_structure import SourceRule, AssetRule, FileRule
# gui/prediction_handler.py
import logging
from pathlib import Path
import time # For potential delays if needed
import os # For cpu_count
from concurrent.futures import ThreadPoolExecutor, as_completed # For parallel prediction
from collections import defaultdict
# --- PySide6 Imports ---
from PySide6.QtCore import QObject, Signal, QThread # Import QThread
from PySide6.QtCore import QObject, Signal, QThread, Slot # Import QThread and Slot
# --- Backend Imports ---
# Adjust path to ensure modules can be found relative to this file's location
@@ -43,6 +45,8 @@ class PredictionHandler(QObject):
# Emits a list of dictionaries, each representing a file row for the table
# Dict format: {'original_path': str, 'predicted_asset_name': str | None, 'predicted_output_name': str | None, 'status': str, 'details': str | None, 'source_asset': str}
prediction_results_ready = Signal(list)
# Emitted when the hierarchical rule structure is ready
rule_hierarchy_ready = Signal(object) # Emits a SourceRule object
# Emitted when all predictions for a batch are done
prediction_finished = Signal()
# Emitted for status updates
@@ -57,78 +61,72 @@ class PredictionHandler(QObject):
def is_running(self):
return self._is_running
def _predict_single_asset(self, input_path_str: str, config: Configuration) -> list[dict]:
def _predict_single_asset(self, input_path_str: str, config: Configuration, rules: SourceRule) -> list[dict]:
"""
Helper method to predict a single asset. Runs within the ThreadPoolExecutor.
Returns a list of prediction dictionaries for the asset, or a single error dict.
Helper method to run detailed file prediction for a single input path.
Runs within the ThreadPoolExecutor.
Returns a list of file prediction dictionaries for the input; error cases are returned as a single-entry list.
"""
input_path = Path(input_path_str)
source_asset_name = input_path.name # For reference in the results
asset_results = []
source_asset_name = input_path.name # For reference in error reporting
try:
# Create AssetProcessor instance (needs dummy output path)
# Ensure AssetProcessor is thread-safe or create a new instance per thread.
# Based on its structure (using temp dirs), creating new instances should be safe.
# Create AssetProcessor instance (needs dummy output path for prediction)
# The detailed prediction method handles its own workspace setup/cleanup
processor = AssetProcessor(input_path, config, Path(".")) # Dummy output path
# Get detailed file predictions
detailed_predictions = processor.get_detailed_file_predictions()
# Get the detailed file predictions
# This method returns a list of dictionaries
detailed_predictions = processor.get_detailed_file_predictions(rules)
if detailed_predictions is None:
log.error(f"Detailed prediction failed critically for {input_path_str}. Adding asset-level error.")
# Add a single error entry for the whole asset if the method returns None
asset_results.append({
'original_path': source_asset_name, # Use asset name as placeholder
'predicted_asset_name': None, # New key
'predicted_output_name': None, # New key
log.error(f"AssetProcessor.get_detailed_file_predictions returned None for {input_path_str}.")
# Return a list containing a single error entry for consistency
return [{
'original_path': source_asset_name,
'predicted_asset_name': None,
'predicted_output_name': None,
'status': 'Error',
'details': 'Critical prediction failure (check logs)',
'details': 'Prediction returned no results',
'source_asset': source_asset_name
})
else:
log.debug(f"Received {len(detailed_predictions)} detailed predictions for {input_path_str}.")
# Add source_asset key and ensure correct keys exist
for prediction_dict in detailed_predictions:
# Ensure all expected keys are present, even if None
result_entry = {
'original_path': prediction_dict.get('original_path', '[Missing Path]'),
'predicted_asset_name': prediction_dict.get('predicted_asset_name'), # New key
'predicted_output_name': prediction_dict.get('predicted_output_name'), # New key
'status': prediction_dict.get('status', 'Error'),
'details': prediction_dict.get('details', '[Missing Details]'),
'source_asset': source_asset_name # Add the source asset identifier
}
asset_results.append(result_entry)
}]
except AssetProcessingError as e: # Catch errors during processor instantiation or prediction setup
log.error(f"Asset processing error during prediction setup for {input_path_str}: {e}")
asset_results.append({
# Add the source_asset name to each prediction result for grouping later
for prediction in detailed_predictions:
prediction['source_asset'] = source_asset_name
log.debug(f"Generated {len(detailed_predictions)} detailed predictions for {input_path_str}.")
return detailed_predictions # Return the list of dictionaries
except AssetProcessingError as e:
log.error(f"Asset processing error during prediction for {input_path_str}: {e}")
# Return a list containing a single error entry for consistency
return [{
'original_path': source_asset_name,
'predicted_asset_name': None,
'predicted_output_name': None,
'status': 'Error',
'details': f'Asset Error: {e}',
'source_asset': source_asset_name
})
except Exception as e: # Catch unexpected errors
}]
except Exception as e:
log.exception(f"Unexpected error during prediction for {input_path_str}: {e}")
asset_results.append({
# Return a list containing a single error entry for consistency
return [{
'original_path': source_asset_name,
'predicted_asset_name': None,
'predicted_output_name': None,
'status': 'Error',
'details': f'Unexpected Error: {e}',
'source_asset': source_asset_name
})
finally:
# Cleanup for the single asset prediction if needed (AssetProcessor handles its own temp dir)
pass
return asset_results
}]
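Returning errors in the same list-of-dicts shape as successes keeps the consumer loop free of type checks. A reduced sketch of that pattern (field names follow the prediction-dict format above; `predict_one` and the sample inputs are illustrative):

```python
# Sketch of the "errors share the success shape" pattern used by
# _predict_single_asset: every outcome is a list of row dicts, so the caller
# can always extend() its accumulator without isinstance() branching.
def predict_one(path: str) -> list[dict]:
    try:
        if path.endswith(".zip"):
            raise ValueError("archives not supported in this sketch")
        return [{'original_path': path, 'status': 'OK', 'details': None,
                 'source_asset': path}]
    except ValueError as e:
        return [{'original_path': path, 'status': 'Error',
                 'details': f'Asset Error: {e}', 'source_asset': path}]

all_results = []
for p in ["wood_col.png", "rocks.zip"]:
    all_results.extend(predict_one(p))

print([r['status'] for r in all_results])  # -> ['OK', 'Error']
```

This is why the handler can blindly `extend()` in the `as_completed` loop: one uniform shape for both outcomes.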
def run_prediction(self, input_paths: list[str], preset_name: str):
@Slot(list, str, object)  # match start_prediction_signal's signature for queued delivery
def run_prediction(self, input_paths: list[str], preset_name: str, rules: SourceRule):
"""
Runs the prediction logic for the given paths and preset using a ThreadPoolExecutor.
Generates the hierarchical rule structure and detailed file predictions.
This method is intended to be run in a separate QThread.
"""
if self._is_running:
@@ -156,7 +154,6 @@ class PredictionHandler(QObject):
except ConfigurationError as e:
log.error(f"Failed to load configuration for preset '{preset_name}': {e}")
self.status_message.emit(f"Error loading preset '{preset_name}': {e}", 5000)
# Emit error for all items? Or just finish? Finish for now.
self.prediction_finished.emit()
self._is_running = False
return
@@ -166,9 +163,15 @@ class PredictionHandler(QObject):
self.prediction_finished.emit()
return
all_file_results = [] # Accumulate results here
# Create the root SourceRule object
# For now, use a generic name. Later, this might be derived from input paths.
source_rule = SourceRule()
log.debug(f"Created root SourceRule object.")
# Collect all detailed file prediction results from completed futures
all_file_prediction_results = []
futures = []
# Determine number of workers: half the available cores, minimum 1, capped at 8
max_workers = min(max(1, (os.cpu_count() or 1) // 2), 8)
log.info(f"Using ThreadPoolExecutor with max_workers={max_workers} for prediction.")
@@ -176,22 +179,34 @@ class PredictionHandler(QObject):
with ThreadPoolExecutor(max_workers=max_workers) as executor:
# Submit tasks for each input path
for input_path_str in input_paths:
future = executor.submit(self._predict_single_asset, input_path_str, config)
# _predict_single_asset now returns a list of file prediction dicts or an error dict list
future = executor.submit(self._predict_single_asset, input_path_str, config, rules)
futures.append(future)
# Process results as they complete
for future in as_completed(futures):
try:
# Result is a list of dicts for one asset
asset_result_list = future.result()
if asset_result_list: # Check if list is not empty
all_file_results.extend(asset_result_list)
result = future.result()
if isinstance(result, list):
# Extend the main list with results from this asset
all_file_prediction_results.extend(result)
elif isinstance(result, dict) and result.get('status') == 'Error':
# Handle error dictionaries returned by _predict_single_asset (should be in a list now, but handle single dict for safety)
all_file_prediction_results.append(result)
else:
log.error(f'Prediction task returned unexpected result type: {type(result)}')
all_file_prediction_results.append({
'original_path': '[Unknown Asset - Unexpected Result]',
'predicted_asset_name': None,
'predicted_output_name': None,
'status': 'Error',
'details': f'Unexpected result type: {type(result)}',
'source_asset': '[Unknown]'
})
except Exception as exc:
# This catches errors within the future execution itself if not handled by _predict_single_asset
log.error(f'Prediction task generated an exception: {exc}', exc_info=True)
# We might not know which input path failed here easily without more mapping
# Add a generic error?
all_file_results.append({
all_file_prediction_results.append({
'original_path': '[Unknown Asset - Executor Error]',
'predicted_asset_name': None,
'predicted_output_name': None,
@@ -203,8 +218,7 @@ class PredictionHandler(QObject):
except Exception as pool_exc:
log.exception(f"An error occurred with the prediction ThreadPoolExecutor: {pool_exc}")
self.status_message.emit(f"Error during prediction setup: {pool_exc}", 5000)
# Add a generic error if the pool fails
all_file_results.append({
all_file_prediction_results.append({
'original_path': '[Prediction Pool Error]',
'predicted_asset_name': None,
'predicted_output_name': None,
@@ -213,19 +227,61 @@ class PredictionHandler(QObject):
'source_asset': '[System]'
})
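The executor fan-out above (one future per input, results gathered with `as_completed`, per-future exceptions caught so one bad input cannot abort the batch) can be shown in miniature. Unlike the handler's plain futures list, this sketch maps futures back to their inputs with a dict, which addresses the "we might not know which input path failed" note; `fake_predict` is illustrative only:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Miniature of the prediction fan-out: one future per input path, results
# gathered in completion order, per-future errors turned into error rows.
def fake_predict(path: str) -> list[dict]:
    if "bad" in path:
        raise RuntimeError(f"cannot read {path}")
    return [{'original_path': path, 'status': 'OK'}]

inputs = ["a.png", "bad.png", "c.png"]
results = []
with ThreadPoolExecutor(max_workers=2) as executor:
    # Map future -> input so a failure can be attributed to its path
    futures = {executor.submit(fake_predict, p): p for p in inputs}
    for future in as_completed(futures):
        try:
            results.extend(future.result())
        except Exception as exc:  # mirrors the executor-level error handling
            results.append({'original_path': futures[future],
                            'status': 'Error', 'details': str(exc)})

print(sorted(r['status'] for r in results))  # -> ['Error', 'OK', 'OK']
```

`as_completed` yields futures as they finish, so results arrive out of input order; the handler tolerates this because each row carries its own `source_asset`.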
# Emit the combined list of detailed file results at the end
# Note: thread_id was already defined earlier in this function
log.info(f"[{time.time():.4f}][T:{thread_id}] Parallel prediction run finished. Preparing to emit {len(all_file_results)} file results.")
# <<< Add logging before emit >>>
log.debug(f"[{time.time():.4f}][T:{thread_id}] Type of all_file_results before emit: {type(all_file_results)}")
# --- Build the hierarchical rule structure (SourceRule -> AssetRule -> FileRule) ---
# Group file prediction results by predicted_asset_name
grouped_by_asset = defaultdict(list)
for file_pred in all_file_prediction_results:
# Group by predicted_asset_name, handle None or errors
asset_name = file_pred.get('predicted_asset_name')
if asset_name is None:
# Files without a predicted asset name are grouped under their bracketed source_asset name so they still appear in the hierarchy.
asset_name = f"[{file_pred.get('source_asset', 'UnknownSource')}]" # Use source asset name as a fallback identifier
log.debug(f"File '{file_pred.get('original_path', 'UnknownPath')}' has no predicted asset name, grouping under '{asset_name}' for hierarchy.")
grouped_by_asset[asset_name].append(file_pred)
# Create AssetRule objects from the grouped results
asset_rules = []
for asset_name, file_preds in grouped_by_asset.items():
# Note the source path for the asset group; AssetRule has no field for it yet, so this is currently informational only.
source_asset_path = file_preds[0].get('source_asset', asset_name)  # Fallback to asset_name if source_asset is missing
asset_rule = AssetRule(asset_name=asset_name)
# Create FileRule objects from the file prediction dictionaries
for file_pred in file_preds:
file_rule = FileRule(
file_path=file_pred.get('original_path', 'UnknownPath'),
map_type_override=None, # Assuming these are not predicted here
resolution_override=None, # Assuming these are not predicted here
channel_merge_instructions={}, # Assuming these are not predicted here
output_format_override=None # Assuming these are not predicted here
)
asset_rule.files.append(file_rule)
asset_rules.append(asset_rule)
# Populate the SourceRule with the collected AssetRules
source_rule.assets = asset_rules
log.debug(f"Built SourceRule with {len(asset_rules)} AssetRule(s).")
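The grouping step above can be sketched standalone. A minimal, hedged version with stand-in dataclasses — `build_asset_rules` is our name for illustration, not part of the handler:

```python
from collections import defaultdict
from dataclasses import dataclass, field

# Minimal stand-ins for the rule_structure dataclasses (fields trimmed).
@dataclass
class FileRule:
    file_path: str = ""

@dataclass
class AssetRule:
    asset_name: str = ""
    files: list = field(default_factory=list)

def build_asset_rules(file_predictions):
    """Group per-file prediction dicts by predicted asset name and wrap them in rules."""
    grouped = defaultdict(list)
    for pred in file_predictions:
        # Fall back to a bracketed source name when no asset name was predicted
        name = pred.get('predicted_asset_name') or f"[{pred.get('source_asset', 'UnknownSource')}]"
        grouped[name].append(pred)
    return [
        AssetRule(asset_name=name,
                  files=[FileRule(file_path=p.get('original_path', 'UnknownPath')) for p in preds])
        for name, preds in grouped.items()
    ]

preds = [
    {'original_path': 'a/diff.png', 'predicted_asset_name': 'Wood', 'source_asset': 'a.zip'},
    {'original_path': 'a/nrm.png', 'predicted_asset_name': 'Wood', 'source_asset': 'a.zip'},
    {'original_path': 'b/mesh.obj', 'predicted_asset_name': None, 'source_asset': 'b.zip'},
]
rules = build_asset_rules(preds)
print([(r.asset_name, len(r.files)) for r in rules])  # → [('Wood', 2), ('[b.zip]', 1)]
```

Unlike the handler, this sketch also routes empty-string asset names to the fallback bucket; the handler only checks for `None`.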
# Emit the hierarchical rule structure
log.info(f"[{time.time():.4f}][T:{thread_id}] Parallel prediction run finished. Preparing to emit rule hierarchy.")
self.rule_hierarchy_ready.emit(source_rule)
log.info(f"[{time.time():.4f}][T:{thread_id}] Emitted rule_hierarchy_ready signal.")
# Emit the combined list of detailed file results for the table view
log.info(f"[{time.time():.4f}][T:{thread_id}] Preparing to emit {len(all_file_prediction_results)} file results for table view.")
log.debug(f"[{time.time():.4f}][T:{thread_id}] Type of all_file_prediction_results before emit: {type(all_file_prediction_results)}")
try:
log.debug(f"[{time.time():.4f}][T:{thread_id}] Content of all_file_results (first 5) before emit: {all_file_results[:5]}")
log.debug(f"[{time.time():.4f}][T:{thread_id}] Content of all_file_prediction_results (first 5) before emit: {all_file_prediction_results[:5]}")
except Exception as e:
log.error(f"[{time.time():.4f}][T:{thread_id}] Error logging all_file_results content: {e}")
# <<< End added logging >>>
log.error(f"[{time.time():.4f}][T:{thread_id}] Error logging all_file_prediction_results content: {e}")
log.info(f"[{time.time():.4f}][T:{thread_id}] Emitting prediction_results_ready signal...")
self.prediction_results_ready.emit(all_file_results)
self.prediction_results_ready.emit(all_file_prediction_results)
log.info(f"[{time.time():.4f}][T:{thread_id}] Emitted prediction_results_ready signal.")
self.status_message.emit("Preview update complete.", 3000)
self.prediction_finished.emit()
self._is_running = False


@@ -7,6 +7,7 @@ import time # For potential delays if needed
import subprocess # <<< ADDED IMPORT
import shutil # <<< ADDED IMPORT
from typing import Optional # <<< ADDED IMPORT
from rule_structure import SourceRule # Import SourceRule
# --- PySide6 Imports ---
# Inherit from QObject to support signals/slots for thread communication
@@ -70,7 +71,7 @@ class ProcessingHandler(QObject):
return self._is_running
def run_processing(self, input_paths: list[str], preset_name: str, output_dir_str: str, overwrite: bool, num_workers: int,
run_blender: bool, nodegroup_blend_path: str, materials_blend_path: str, verbose: bool): # <<< ADDED verbose PARAM
run_blender: bool, nodegroup_blend_path: str, materials_blend_path: str, verbose: bool, rules: SourceRule): # <<< ADDED rules PARAM
"""
Starts the asset processing task and optionally runs Blender scripts afterwards.
This method should be called when the handler is moved to a separate thread.
@@ -107,7 +108,7 @@ class ProcessingHandler(QObject):
for input_path in input_paths:
if self._cancel_requested: break # Check before submitting more
log.debug(f"Submitting task for: {input_path}")
future = executor.submit(process_single_asset_wrapper, input_path, preset_name, output_dir_str, overwrite, verbose=verbose) # Pass verbose flag from GUI
future = executor.submit(process_single_asset_wrapper, input_path, preset_name, output_dir_str, overwrite, verbose=verbose, rules=rules) # Pass verbose flag from GUI and rules
self._futures[future] = input_path # Map future back to input path
# Optionally emit "processing" status here
self.file_status_updated.emit(input_path, "processing", "")

gui/rule_editor_widget.py Normal file

@@ -0,0 +1,180 @@
import sys
from PySide6.QtWidgets import (QApplication, QWidget, QVBoxLayout, QLabel, QLineEdit,
QFormLayout, QComboBox, QCheckBox, QSpinBox, QDoubleSpinBox)
from PySide6.QtCore import Signal, Slot
# rule_structure.py lives one level up; adjust the import to the project layout, e.g.:
# from ..rule_structure import SourceRule, AssetRule, FileRule
# or, when run from the project root:
# from rule_structure import SourceRule, AssetRule, FileRule
class RuleEditorWidget(QWidget):
"""
A widget to display and edit hierarchical processing rules (Source, Asset, File).
"""
rule_updated = Signal(object) # Signal emitted when a rule is updated
def __init__(self, parent=None):
super().__init__(parent)
self.current_rule_type = None
self.current_rule_object = None
self.layout = QVBoxLayout(self)
self.rule_type_label = QLabel("Select an item in the hierarchy to view/edit rules.")
self.layout.addWidget(self.rule_type_label)
self.form_layout = QFormLayout()
self.layout.addLayout(self.form_layout)
self.layout.addStretch() # Add stretch to push content to the top
self.setLayout(self.layout)
self.clear_editor()
@Slot(object, str)
def load_rule(self, rule_object, rule_type_name):
"""
Loads a rule object into the editor.
Args:
rule_object: The SourceRule, AssetRule, or FileRule object.
rule_type_name: The name of the rule type ('SourceRule', 'AssetRule', 'FileRule').
"""
self.clear_editor()
self.current_rule_object = rule_object
self.current_rule_type = rule_type_name
self.rule_type_label.setText(f"Editing: {rule_type_name}")
if rule_object:
# Dynamically create form fields based on rule object attributes
for attr_name, attr_value in vars(rule_object).items():
if attr_name.startswith('_'): # Skip private attributes
continue
label = QLabel(attr_name.replace('_', ' ').title() + ":")
editor_widget = self._create_editor_widget(attr_name, attr_value)
if editor_widget:
self.form_layout.addRow(label, editor_widget)
# Connect signal to update rule object
self._connect_editor_signal(editor_widget, attr_name)
def _create_editor_widget(self, attr_name, attr_value):
"""
Creates an appropriate editor widget based on the attribute type.
"""
if isinstance(attr_value, bool):
widget = QCheckBox()
widget.setChecked(attr_value)
return widget
elif isinstance(attr_value, int):
widget = QSpinBox()
widget.setRange(-2147483648, 2147483647) # Default integer range
widget.setValue(attr_value)
return widget
elif isinstance(attr_value, float):
widget = QDoubleSpinBox()
widget.setRange(-sys.float_info.max, sys.float_info.max) # Default float range
widget.setValue(attr_value)
return widget
elif isinstance(attr_value, (str, type(None))): # Handle None for strings
widget = QLineEdit()
widget.setText(str(attr_value) if attr_value is not None else "")
return widget
# Add more types as needed (e.g., dropdowns for enums/choices)
# elif isinstance(attr_value, list):
# # Example for a simple list of strings
# widget = QLineEdit()
# widget.setText(", ".join(map(str, attr_value)))
# return widget
else:
# For unsupported types, just display the value
label = QLabel(str(attr_value))
return label
def _connect_editor_signal(self, editor_widget, attr_name):
"""
Connects the appropriate signal of the editor widget to the update logic.
"""
if isinstance(editor_widget, QLineEdit):
editor_widget.textChanged.connect(lambda text: self._update_rule_attribute(attr_name, text))
elif isinstance(editor_widget, QCheckBox):
editor_widget.toggled.connect(lambda checked: self._update_rule_attribute(attr_name, checked))
elif isinstance(editor_widget, QSpinBox):
editor_widget.valueChanged.connect(lambda value: self._update_rule_attribute(attr_name, value))
elif isinstance(editor_widget, QDoubleSpinBox):
editor_widget.valueChanged.connect(lambda value: self._update_rule_attribute(attr_name, value))
# Add connections for other widget types
def _update_rule_attribute(self, attr_name, value):
"""
Updates the attribute of the current rule object and emits the signal.
"""
if self.current_rule_object:
# Basic type conversion based on the original attribute type
original_value = getattr(self.current_rule_object, attr_name)
try:
if isinstance(original_value, bool):
converted_value = bool(value)
elif isinstance(original_value, int):
converted_value = int(value)
elif isinstance(original_value, float):
converted_value = float(value)
elif isinstance(original_value, (str, type(None))):
converted_value = str(value) if value != "" else None  # Empty input maps to None (string fields are treated as nullable)
else:
converted_value = value # Fallback for other types
setattr(self.current_rule_object, attr_name, converted_value)
self.rule_updated.emit(self.current_rule_object)
# print(f"Updated {attr_name} to {converted_value} in {self.current_rule_type}") # Debugging
except ValueError:
# Handle potential conversion errors (e.g., non-numeric input for int/float)
print(f"Error converting value '{value}' for attribute '{attr_name}'")
# Optionally, revert the editor widget to the original value or show an error indicator
def clear_editor(self):
"""
Clears the form layout.
"""
self.current_rule_object = None
self.current_rule_type = None
self.rule_type_label.setText("Select an item in the hierarchy to view/edit rules.")
while self.form_layout.rowCount() > 0:
self.form_layout.removeRow(0)
if __name__ == '__main__':
    app = QApplication(sys.argv)

    # Placeholder rule classes for standalone testing
    from dataclasses import dataclass

    @dataclass
    class SourceRule:
        source_setting_1: str = "default_source_string"
        source_setting_2: int = 123
        source_setting_3: bool = True

    @dataclass
    class AssetRule:
        asset_setting_a: float = 4.56
        asset_setting_b: "str | None" = None  # None default exercises the editor's None handling
        asset_setting_c: bool = False

    @dataclass
    class FileRule:
        file_setting_x: int = 789
        file_setting_y: str = "default_file_string"

    editor = RuleEditorWidget()

    # Test loading different rule types
    source_rule = SourceRule()
    asset_rule = AssetRule()
    file_rule = FileRule()
    editor.load_rule(source_rule, "SourceRule")
    # editor.load_rule(asset_rule, "AssetRule")
    # editor.load_rule(file_rule, "FileRule")

    editor.show()
    sys.exit(app.exec())
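The type-coercion rule inside `_update_rule_attribute` can be exercised without Qt. A minimal sketch — the function name `coerce_to_original_type` is ours for illustration, not part of the widget:

```python
def coerce_to_original_type(original_value, new_value):
    """Mirror the editor's conversion rule: cast the incoming widget value to the
    type of the attribute's current value; empty strings become None for nullable fields."""
    if isinstance(original_value, bool):   # check bool before int: bool is an int subclass
        return bool(new_value)
    if isinstance(original_value, int):
        return int(new_value)
    if isinstance(original_value, float):
        return float(new_value)
    if isinstance(original_value, (str, type(None))):
        return str(new_value) if new_value != "" else None
    return new_value  # fall back for unsupported types

print(coerce_to_original_type(3, "42"))   # → 42
print(coerce_to_original_type(True, 0))   # → False
print(coerce_to_original_type(None, ""))  # → None
```

The bool-before-int ordering matters: `isinstance(True, int)` is also true, so checking `int` first would silently turn checkbox values into `0`/`1`.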

gui/rule_hierarchy_model.py Normal file

@@ -0,0 +1,184 @@
from PySide6.QtCore import QAbstractItemModel, QModelIndex, Qt
from PySide6.QtGui import QIcon # Assuming we might want icons later
from rule_structure import SourceRule, AssetRule, FileRule # Import rule structures
class RuleHierarchyModel(QAbstractItemModel):
"""
A custom model for displaying the hierarchical structure of SourceRule,
AssetRule, and FileRule objects in a QTreeView.
"""
def __init__(self, root_rule: SourceRule = None, parent=None):
super().__init__(parent)
self._root_rule = root_rule
def set_root_rule(self, root_rule: SourceRule):
"""Sets the root SourceRule for the model and resets the model."""
self.beginResetModel()
self._root_rule = root_rule
self.endResetModel()
def rowCount(self, parent: QModelIndex = QModelIndex()):
"""Returns the number of rows (children) for the given parent index."""
if not parent.isValid():
# Root item (SourceRule)
return 1 if self._root_rule else 0
else:
parent_item = parent.internalPointer()
if isinstance(parent_item, SourceRule):
# Children of SourceRule are AssetRules
return len(parent_item.assets)
elif isinstance(parent_item, AssetRule):
# Children of AssetRule are FileRules
return len(parent_item.files)
elif isinstance(parent_item, FileRule):
# FileRules have no children
return 0
else:
return 0
def columnCount(self, parent: QModelIndex = QModelIndex()):
"""Returns the number of columns."""
return 1 # We only need one column for the hierarchy name
def data(self, index: QModelIndex, role: int = Qt.ItemDataRole.DisplayRole):
"""Returns the data for the given index and role."""
if not index.isValid():
return None
item = index.internalPointer()
if role == Qt.ItemDataRole.DisplayRole:
if isinstance(item, SourceRule):
return f"Source: {item.input_path}" # Or some other identifier
elif isinstance(item, AssetRule):
return f"Asset: {item.asset_name}" # Or some other identifier
elif isinstance(item, FileRule):
return f"File: {item.file_path}" # Or some other identifier
else:
return None
# Add other roles as needed (e.g., Qt.ItemDataRole.DecorationRole for icons)
# elif role == Qt.ItemDataRole.DecorationRole:
# if isinstance(item, SourceRule):
# return QIcon("icons/source.png") # Placeholder icon
# elif isinstance(item, AssetRule):
# return QIcon("icons/asset.png") # Placeholder icon
# elif isinstance(item, FileRule):
# return QIcon("icons/file.png") # Placeholder icon
# else:
# return None
return None
def index(self, row: int, column: int, parent: QModelIndex = QModelIndex()):
"""Returns the model index for the given row, column, and parent index."""
if not self.hasIndex(row, column, parent):
return QModelIndex()
if not parent.isValid():
# Requesting index for the root item (SourceRule)
if self._root_rule and row == 0:
return self.createIndex(row, column, self._root_rule)
else:
return QModelIndex()
else:
parent_item = parent.internalPointer()
if isinstance(parent_item, SourceRule):
# Children are AssetRules
if 0 <= row < len(parent_item.assets):
child_item = parent_item.assets[row]
return self.createIndex(row, column, child_item)
else:
return QModelIndex()
elif isinstance(parent_item, AssetRule):
# Children are FileRules
if 0 <= row < len(parent_item.files):
child_item = parent_item.files[row]
return self.createIndex(row, column, child_item)
else:
return QModelIndex()
else:
return QModelIndex() # Should not happen for FileRule parents
def parent(self, index: QModelIndex):
"""Returns the parent index for the given index."""
if not index.isValid():
return QModelIndex()
child_item = index.internalPointer()
if isinstance(child_item, SourceRule):
# SourceRule is the root, has no parent in the model hierarchy
return QModelIndex()
elif isinstance(child_item, AssetRule):
# Find the SourceRule that contains this AssetRule
if self._root_rule and child_item in self._root_rule.assets:
# The row of the SourceRule is always 0 in this model
return self.createIndex(0, 0, self._root_rule)
else:
return QModelIndex() # Should not happen if data is consistent
elif isinstance(child_item, FileRule):
# Find the AssetRule that contains this FileRule
if self._root_rule:
for asset_row, asset_rule in enumerate(self._root_rule.assets):
if child_item in asset_rule.files:
# The row of the parent AssetRule within the SourceRule's children
return self.createIndex(asset_row, 0, asset_rule)
return QModelIndex() # Should not happen if data is consistent
else:
return QModelIndex() # Unknown item type
def headerData(self, section: int, orientation: Qt.Orientation, role: int = Qt.ItemDataRole.DisplayRole):
"""Returns the data for the header."""
if orientation == Qt.Orientation.Horizontal and role == Qt.ItemDataRole.DisplayRole:
if section == 0:
return "Hierarchy"
return None
def get_item_from_index(self, index: QModelIndex):
"""Helper to get the underlying rule object from a model index."""
if index.isValid():
return index.internalPointer()
return None
if __name__ == '__main__':
    # Example usage (for testing the model)
    from PySide6.QtWidgets import QApplication, QTreeView
    from dataclasses import dataclass, field

    # Placeholder rule structures shadowing the imports. Their attribute names must
    # match what data() reads (file_path / asset_name / input_path), or the view
    # would raise AttributeError when rendering.
    @dataclass
    class FileRule:
        file_path: str = "file"
        setting_f1: str = "value1"
        setting_f2: int = 10

    @dataclass
    class AssetRule:
        asset_name: str = "asset"
        files: list[FileRule] = field(default_factory=list)
        setting_a1: bool = True
        setting_a2: float = 3.14

    @dataclass
    class SourceRule:
        input_path: str = "source"
        assets: list[AssetRule] = field(default_factory=list)
        setting_s1: str = "hello"

    # Create a sample hierarchical structure
    file1 = FileRule(file_path="texture_diffuse.png")
    file2 = FileRule(file_path="texture_normal.png")
    file3 = FileRule(file_path="model.obj")
    asset1 = AssetRule(asset_name="Material_01", files=[file1, file2])
    asset2 = AssetRule(asset_name="Model_01", files=[file3])
    source_rule_instance = SourceRule(input_path="Input_Archive", assets=[asset1, asset2])

    app = QApplication([])
    tree_view = QTreeView()
    model = RuleHierarchyModel(source_rule_instance)
    tree_view.setModel(model)
    tree_view.setWindowTitle("Rule Hierarchy Example")
    tree_view.show()
    app.exec()
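The `parent()` lookup logic above can also be exercised without Qt. A hedged sketch with stand-in dataclasses (attribute names taken from `rule_structure.py`; `parent_row_of` is our name for illustration):

```python
from dataclasses import dataclass, field

# Stand-ins mirroring the hierarchy the model walks (fields trimmed).
@dataclass
class FileRule:
    file_path: str

@dataclass
class AssetRule:
    asset_name: str
    files: list = field(default_factory=list)

@dataclass
class SourceRule:
    input_path: str
    assets: list = field(default_factory=list)

def parent_row_of(root: SourceRule, item):
    """Resolve (row, parent) the way the model's parent() does; (None, None) for the root."""
    if isinstance(item, AssetRule) and item in root.assets:
        return 0, root                      # the single SourceRule always sits at row 0
    if isinstance(item, FileRule):
        for row, asset in enumerate(root.assets):
            if item in asset.files:
                return row, asset           # row of the owning AssetRule under the source
    return None, None

f = FileRule("nrm.png")
root = SourceRule("in.zip", assets=[AssetRule("Wood"), AssetRule("Metal", files=[f])])
row, parent = parent_row_of(root, f)
print(row, parent.asset_name)  # → 1 Metal
```

Because dataclasses compare by value, a duplicate file under two assets would resolve to the first match; the same caveat applies to the model's `in`-based `parent()` implementation.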

rule_structure.py Normal file

@@ -0,0 +1,53 @@
import dataclasses
import json
from typing import List, Dict, Any, Tuple, Optional


@dataclasses.dataclass
class FileRule:
    file_path: Optional[str] = None
    map_type_override: Optional[str] = None
    resolution_override: Optional[Tuple[int, int]] = None  # NOTE: round-trips through JSON as a list
    channel_merge_instructions: Dict[str, Any] = dataclasses.field(default_factory=dict)
    output_format_override: Optional[str] = None  # Potentially others identified during integration

    def to_json(self) -> str:
        return json.dumps(dataclasses.asdict(self), indent=4)

    @classmethod
    def from_json(cls, json_string: str) -> 'FileRule':
        data = json.loads(json_string)
        return cls(**data)


@dataclasses.dataclass
class AssetRule:
    asset_name: Optional[str] = None
    asset_type: Optional[str] = None
    common_metadata: Dict[str, Any] = dataclasses.field(default_factory=dict)
    files: List[FileRule] = dataclasses.field(default_factory=list)

    def to_json(self) -> str:
        return json.dumps(dataclasses.asdict(self), indent=4)

    @classmethod
    def from_json(cls, json_string: str) -> 'AssetRule':
        data = json.loads(json_string)
        # Rebuild nested FileRule objects from their dict form
        data['files'] = [FileRule(**file_data) for file_data in data.get('files', [])]
        return cls(**data)


@dataclasses.dataclass
class SourceRule:
    supplier_identifier: Optional[str] = None
    high_level_sorting_parameters: Dict[str, Any] = dataclasses.field(default_factory=dict)
    assets: List[AssetRule] = dataclasses.field(default_factory=list)
    input_path: Optional[str] = None

    def to_json(self) -> str:
        return json.dumps(dataclasses.asdict(self), indent=4)

    @classmethod
    def from_json(cls, json_string: str) -> 'SourceRule':
        data = json.loads(json_string)
        # Rebuild nested AssetRule objects (which in turn rebuild their FileRules)
        data['assets'] = [AssetRule.from_json(json.dumps(asset_data)) for asset_data in data.get('assets', [])]
        return cls(**data)
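A round-trip through `to_json`/`from_json` can be checked with a trimmed copy of the dataclasses above (fields reduced for brevity). One caveat worth knowing: JSON has no tuple type, so `resolution_override` comes back as a list.

```python
import dataclasses
import json
from typing import List, Optional, Tuple

# Trimmed copies of the rule dataclasses above, just enough for a round trip.
@dataclasses.dataclass
class FileRule:
    file_path: Optional[str] = None
    resolution_override: Optional[Tuple[int, int]] = None

@dataclasses.dataclass
class AssetRule:
    asset_name: Optional[str] = None
    files: List[FileRule] = dataclasses.field(default_factory=list)

    @classmethod
    def from_json(cls, json_string: str) -> 'AssetRule':
        data = json.loads(json_string)
        data['files'] = [FileRule(**f) for f in data.get('files', [])]
        return cls(**data)

@dataclasses.dataclass
class SourceRule:
    input_path: Optional[str] = None
    assets: List[AssetRule] = dataclasses.field(default_factory=list)

    def to_json(self) -> str:
        return json.dumps(dataclasses.asdict(self), indent=4)

    @classmethod
    def from_json(cls, json_string: str) -> 'SourceRule':
        data = json.loads(json_string)
        data['assets'] = [AssetRule.from_json(json.dumps(a)) for a in data.get('assets', [])]
        return cls(**data)

rule = SourceRule(input_path="in.zip",
                  assets=[AssetRule(asset_name="Wood",
                                    files=[FileRule(file_path="diff.png",
                                                    resolution_override=(1024, 1024))])])
restored = SourceRule.from_json(rule.to_json())
print(restored.assets[0].asset_name)                    # → Wood
print(restored.assets[0].files[0].resolution_override)  # → [1024, 1024] (tuple became a list)
```

If tuple identity matters downstream, `from_json` would need to coerce the list back with `tuple(...)`; as written, callers should compare against lists after deserialization.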