Major Terminology unification and refactor [Needs thorough testing]

This commit is contained in:
Rusfort 2025-05-06 18:26:26 +02:00
parent 0a3100d448
commit ff548e902e
20 changed files with 689 additions and 356 deletions

View File

@ -9,7 +9,7 @@ The tool's core settings are now stored in `config/app_settings.json`. This JSON
The `configuration.py` module is responsible for loading the settings from `app_settings.json` (including loading and saving the JSON content), merging them with the rules from the selected preset file, and providing the base configuration via the `load_base_config()` function. Note that the old `config.py` file has been deleted.
The `app_settings.json` file is structured into several key sections, including:
- * `FILE_TYPE_DEFINITIONS`: Defines known file types (like different texture maps, models, etc.) and their properties. Each definition now includes a `"standard_type"` key for aliasing to a common type and a `"bit_depth_rule"` key specifying how to handle bit depth for this file type. The separate `MAP_BIT_DEPTH_RULES` section has been removed.
+ * `FILE_TYPE_DEFINITIONS`: Defines known file types (like different texture maps, models, etc.) and their properties. Each definition now includes a `"standard_type"` key for aliasing to a common type (e.g., "COL" for color maps, "NRM" for normal maps), an `"is_grayscale"` boolean property, and a `"bit_depth_rule"` key specifying how to handle bit depth for this file type. The separate `MAP_BIT_DEPTH_RULES` section has been removed. For users creating or editing presets, it's important to note that internal mapping rules (like `Map_type_Mapping.target_type` within a preset's `FileRule`) now directly use the main keys from these `FILE_TYPE_DEFINITIONS` (e.g., `"MAP_COL"`, `"MAP_RGH"`), not just the `standard_type` aliases.
* `ASSET_TYPE_DEFINITIONS`: Defines known asset types (like Surface, Model, Decal) and their properties.
* `MAP_MERGE_RULES`: Defines how multiple input maps can be merged into a single output map (e.g., combining Normal and Roughness into one).
@ -40,7 +40,7 @@ Preset files define supplier-specific rules for interpreting asset source files.
* Presets are located in the `presets/` directory.
* Each preset is a JSON file named after the supplier (e.g., `Poliigon.json`).
* Presets contain rules based on filename patterns and keywords to identify map types, models, and other files.
- * They also define how variants (like different resolutions or bit depths) are handled and how asset names and categories are determined from the source filename.
+ * They also define how variants (like different resolutions or bit depths) are handled and how asset names and categories are determined from the source filename. When defining `map_type_mapping` rules within a preset, the `target_type` field must now use a valid key from the `FILE_TYPE_DEFINITIONS` in `config/app_settings.json` (e.g., `"MAP_AO"` instead of a custom alias like `"AO"`).
When processing assets, you must specify which preset to use. The tool then loads the core settings from `config/app_settings.json` and merges them with the rules from the selected preset to determine how to process the input.
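For orientation, the load-and-merge described here can be sketched in a few lines (illustrative only: `load_base_config()` is the actual entry point, and the helper name, file locations, and shallow merge below are assumptions):

```python
import json
from pathlib import Path

def load_merged_config(preset_name, config_dir=Path("config"), presets_dir=Path("presets")):
    """Sketch: load the core settings, then overlay the supplier preset.

    The real Configuration class may merge nested sections more carefully;
    this shallow update only shows the direction of precedence.
    """
    settings = json.loads((config_dir / "app_settings.json").read_text())
    preset = json.loads((presets_dir / f"{preset_name}.json").read_text())
    merged = dict(settings)
    merged.update(preset)  # preset rules take precedence over core settings
    return merged
```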

View File

@ -23,7 +23,7 @@ The following tokens can be used in both `OUTPUT_DIRECTORY_PATTERN` and `OUTPUT_
* `[Time]`: Current time (`HHMMSS`).
* `[Sha5]`: The first 5 characters of the SHA-256 hash of the original input source file (e.g., the source zip archive).
* `[ApplicationPath]`: Absolute path to the application directory.
- * `[maptype]`: Specific map type (e.g., `Albedo`, `Normal`). (Primarily for filename pattern)
+ * `[maptype]`: The standardized map type identifier (e.g., `COL` for Color/Albedo, `NRM` for Normal, `RGH` for Roughness). This is derived from the `standard_type` defined in the application's `FILE_TYPE_DEFINITIONS` (see `config/app_settings.json`) and may include a variant suffix if applicable. (Primarily for filename pattern)
* `[dimensions]`: Pixel dimensions (e.g., `2048x2048`).
* `[bitdepth]`: Output bit depth (e.g., `8bit`, `16bit`).
* `[category]`: Asset category determined by preset rules.
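Taken together, a minimal token expansion over patterns like these could look as follows (a sketch; the real substitution logic may differ, and `expand_pattern` is a hypothetical helper):

```python
def expand_pattern(pattern, values):
    """Replace [token] placeholders with their values; unknown tokens are left intact."""
    out = pattern
    for token, value in values.items():
        out = out.replace(f"[{token}]", str(value))
    return out

# e.g. expand_pattern("[maptype]_[dimensions]_[bitdepth].png",
#                     {"maptype": "COL", "dimensions": "2048x2048", "bitdepth": "8bit"})
```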

View File

@ -62,8 +62,8 @@ The GUI includes a dedicated editor for modifying the `config/app_settings.json`
Preset files are the primary way to adapt the tool to new asset sources. Developers should use `Presets/_template.json` as a starting point. Key fields include:
* `supplier_name`: The name of the asset source (e.g., `"Poliigon"`). Used for output directory naming.
- * `map_type_mapping`: A list of dictionaries, each mapping source filename patterns/keywords to a standard internal map type (defined in `config.py`).
+ * `map_type_mapping`: A list of dictionaries, each mapping source filename patterns/keywords to a specific file type. The `target_type` for this mapping **must** be a key from `FILE_TYPE_DEFINITIONS` located in `config/app_settings.json`.
- * `target_type`: The standard internal map type (e.g., `"COL"`, `"NRM"`).
+ * `target_type`: The specific file type key from `FILE_TYPE_DEFINITIONS` (e.g., `"MAP_COL"`, `"MAP_NORM_GL"`, `"MAP_RGH"`). This replaces the previous alias-based system. Common aliases like "COL" or "NRM" are now derived from the `standard_type` property within `FILE_TYPE_DEFINITIONS` but are not used directly for `target_type`.
* `keywords`: A list of filename patterns (regex or fnmatch-style wildcards) used to identify this map type. The order of keywords within this list, and the order of dictionaries in the `map_type_mapping` list, determines the priority for assigning variant suffixes (`-1`, `-2`, etc.) when multiple files match the same `target_type`.
* `bit_depth_variants`: A dictionary mapping standard map types (e.g., `"NRM"`) to a pattern identifying its high bit-depth variant (e.g., `"*_NRM16*.tif"`). Files matching these patterns are prioritized over their standard counterparts.
* `map_bit_depth_rules`: Defines how to handle the bit depth of source maps. Can specify a default behavior (`"respect"` or `"force_8bit"`) and overrides for specific map types.
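The priority behaviour described for `map_type_mapping` and `keywords` can be sketched roughly like this (simplified: real matching may also use regex, the variant-suffix assignment is omitted, and `classify` is a hypothetical helper that wraps keywords in `*…*` for substring-style matching, an assumption):

```python
from fnmatch import fnmatch

def classify(filename, map_type_mapping):
    """Return the target_type of the first matching rule, honoring list order as priority."""
    stem = filename.upper()
    for rule in map_type_mapping:         # earlier dictionaries win
        for keyword in rule["keywords"]:  # earlier keywords win
            # Assumption: keywords match anywhere in the name, so wrap them in '*'.
            if fnmatch(stem, f"*{keyword.upper()}*"):
                return rule["target_type"]
    return None
```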

View File

@ -20,11 +20,11 @@ The pipeline steps are:
3. **File Inventory (`_inventory_and_classify_files`)**:
* Scans the contents of the *already prepared* temporary workspace.
- * This step primarily inventories the files present. The *classification* (determining `item_type`, etc.) is taken directly from the input `SourceRule`.
+ * This step primarily inventories the files present. The *classification* (determining `item_type`, etc.) is taken directly from the input `SourceRule`. The `item_type` for each file (within the `FileRule` objects of the `SourceRule`) is expected to be a key from `Configuration.FILE_TYPE_DEFINITIONS`.
* Stores the file paths and their associated rules from the `SourceRule` in `self.classified_files`.
4. **Base Metadata Determination (`_determine_base_metadata`, `_determine_single_asset_metadata`)**:
- * Determines the base asset name, category, and archetype using the explicit values provided in the input `SourceRule` and the static `Configuration`. Overrides (like `supplier_identifier`, `asset_type`, `asset_name_override`) are taken directly from the `SourceRule`.
+ * Determines the base asset name, category, and archetype using the explicit values provided in the input `SourceRule` and the static `Configuration`. Overrides (like `supplier_identifier`, `asset_type`, `asset_name_override`) are taken directly from the `SourceRule`. The `asset_type` (within the `AssetRule` object of the `SourceRule`) is expected to be a key from `Configuration.ASSET_TYPE_DEFINITIONS`.
5. **Skip Check**:
* If the `overwrite` flag is `False`, checks if the final output directory already exists and contains `metadata.json`.
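The skip check reduces to a small filesystem test; roughly (a sketch, assuming the layout described above):

```python
from pathlib import Path

def should_skip(output_dir: Path, overwrite: bool) -> bool:
    """Skip processing when the output already exists with metadata.json, unless overwriting."""
    return not overwrite and (output_dir / "metadata.json").is_file()
```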
@ -43,7 +43,7 @@ The pipeline steps are:
7. **Map Merging (`_merge_maps_from_source`)**:
* Iterates through `MAP_MERGE_RULES` in `Configuration`.
- * Identifies required source maps by checking the `item_type_override` within the `SourceRule` (specifically in the `FileRule` for each file). Files with a base `item_type` of `"FILE_IGNORE"` are explicitly excluded from consideration.
+ * Identifies required source maps by checking the `item_type_override` within the `SourceRule` (specifically in the `FileRule` for each file). Both `item_type` and `item_type_override` are expected to be keys from `Configuration.FILE_TYPE_DEFINITIONS`. Files with a base `item_type` of `"FILE_IGNORE"` are explicitly excluded from consideration.
* Loads source channels, handling missing inputs with defaults from `Configuration` or `SourceRule`.
* Merges channels (`cv2.merge`).
* Determines output format/bit depth and saves the merged map.
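The channel-merge step can be illustrated with NumPy standing in for `cv2.merge` (both stack single-channel images along a new last axis; the channel layout below is an assumed example, not a rule from `MAP_MERGE_RULES`):

```python
import numpy as np

def merge_channels(*channels):
    """Stack single-channel maps into one multi-channel image (equivalent to cv2.merge)."""
    return np.dstack(channels)

# Assumed example layout: pack roughness into the alpha slot of a normal map,
# e.g. merged = merge_channels(nrm_r, nrm_g, nrm_b, rough)
```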

View File

@ -20,7 +20,7 @@ This document outlines the coding conventions and general practices followed wit
* Use Qt's signals and slots mechanism for communication between objects, especially across threads.
* Run long-running or blocking tasks in separate `QThread`s to keep the main UI thread responsive.
* Perform UI updates only from the main UI thread.
- * **Configuration:** Core settings are managed in `config.py` (Python module). Supplier-specific rules are managed in JSON files (`Presets/`). The `Configuration` class handles loading and merging these.
+ * **Configuration:** Core application settings are defined in `config/app_settings.json`. Supplier-specific rules are managed in JSON files within the `Presets/` directory. The `Configuration` class (`configuration.py`) is responsible for loading `app_settings.json` and merging it with the selected preset file.
* **File Paths:** Use `pathlib.Path` objects for handling file system paths. Avoid using string manipulation for path joining or parsing.
* **Docstrings:** Write clear and concise docstrings for modules, classes, methods, and functions, explaining their purpose, arguments, and return values.
* **Comments:** Use comments to explain complex logic or non-obvious parts of the code.
@ -31,4 +31,51 @@ This document outlines the coding conventions and general practices followed wit
* Use `UPPER_CASE` for constants.
* Use a leading underscore (`_`) for internal or "protected" methods/attributes.
## Terminology and Data Standards
To ensure consistency and clarity across the codebase, particularly concerning asset and file classifications, the following standards must be adhered to. These primarily revolve around definitions stored in `config/app_settings.json`.
### `FILE_TYPE_DEFINITIONS`
`FILE_TYPE_DEFINITIONS` in `config/app_settings.json` is the **single source of truth** for all file type identifiers used within the application.
* **`FileRule.item_type` and `FileRule.item_type_override`**: When defining or interpreting `SourceRule` objects (and their constituent `FileRule` instances), the `item_type` and `item_type_override` attributes **must** always use a key directly from `FILE_TYPE_DEFINITIONS`.
* Example: `file_rule.item_type = "MAP_COL"` (for a color map) or `file_rule.item_type = "MODEL_FBX"` (for an FBX model).
* **`standard_type` Property**: Each entry in `FILE_TYPE_DEFINITIONS` includes a `standard_type` property. This provides a common, often abbreviated, alias for the file type.
* Example: `FILE_TYPE_DEFINITIONS["MAP_COL"]["standard_type"]` might be `"COL"`.
* Example: `FILE_TYPE_DEFINITIONS["MAP_NORM_GL"]["standard_type"]` might be `"NRM"`.
* **Removal of `STANDARD_MAP_TYPES`**: The global constant `STANDARD_MAP_TYPES` (previously in `config.py`) has been **removed**. Standard map type aliases (e.g., "COL", "NRM", "RGH") are now derived dynamically from the `standard_type` property of the relevant entry in `FILE_TYPE_DEFINITIONS`.
* **`map_type` Usage**:
* **Filename Tokens**: When used as a token in output filename patterns (e.g., `[maptype]`), `map_type` is typically derived from the `standard_type` of the file's effective `item_type`. It may also include a variant suffix if applicable (e.g., "COL", "COL_var01").
* **General Classification**: For precise classification within code logic or rules, developers should refer to the full `FILE_TYPE_DEFINITIONS` key (e.g., `"MAP_COL"`, `"MAP_METAL"`). The `standard_type` can be used for broader categorization or when a common alias is needed.
* **`Map_type_Mapping.target_type` in Presets**: Within preset files (e.g., `Presets/Poliigon.json`), the `map_type_mapping` rules found inside a `FileRule`'s `map_processing_options` now use keys from `FILE_TYPE_DEFINITIONS` for the `target_type` field.
* Example:
```json
// Inside a FileRule in a preset
"map_processing_options": {
"map_type_mapping": {
"source_type_pattern": ".*ambient occlusion.*",
"target_type": "MAP_AO", // Uses FILE_TYPE_DEFINITIONS key
"source_channels": "RGB"
}
}
```
This replaces old aliases like `"AO"` or `"OCC"`.
* **`is_grayscale` Property**: `FILE_TYPE_DEFINITIONS` entries can now include an `is_grayscale` boolean property. This flag indicates whether the file type is inherently grayscale (e.g., a roughness map). It can be used by the processing engine to inform decisions about channel handling, compression, or specific image operations.
* Example: `FILE_TYPE_DEFINITIONS["MAP_RGH"]["is_grayscale"]` might be `true`.
### `ASSET_TYPE_DEFINITIONS`
Similarly, `ASSET_TYPE_DEFINITIONS` in `config/app_settings.json` is the **single source of truth** for all asset type identifiers.
* **`AssetRule.asset_type`, `AssetRule.asset_type_override`, and `AssetRule.asset_category`**: When defining or interpreting `SourceRule` objects (and their constituent `AssetRule` instances), the `asset_type`, `asset_type_override`, and `asset_category` attributes **must** always use a key directly from `ASSET_TYPE_DEFINITIONS`.
* Example: `asset_rule.asset_type = "SURFACE_3D"` or `asset_rule.asset_category = "FABRIC"`.
Adherence to these definitions ensures that terminology remains consistent throughout the application, from configuration to core logic and output.
Adhering to these conventions will make the codebase more consistent, easier to understand, and more maintainable for all contributors.
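With `STANDARD_MAP_TYPES` removed, the alias list can be derived on the fly; for example (a sketch over an abbreviated, hypothetical slice of `FILE_TYPE_DEFINITIONS`):

```python
# Abbreviated, hypothetical slice of the FILE_TYPE_DEFINITIONS section
FILE_TYPE_DEFINITIONS = {
    "MAP_COL":     {"standard_type": "COL",   "is_grayscale": False},
    "MAP_NORM_GL": {"standard_type": "NRM",   "is_grayscale": False},
    "MAP_ROUGH":   {"standard_type": "ROUGH", "is_grayscale": True},
}

def standard_map_types(definitions):
    """Derive the standard map type aliases dynamically (replaces the old STANDARD_MAP_TYPES constant)."""
    return sorted({entry["standard_type"] for entry in definitions.values() if "standard_type" in entry})
```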

View File

@ -29,7 +29,7 @@
],
"map_type_mapping": [
{
- "target_type": "COL",
+ "target_type": "MAP_COL",
"keywords": [
"COLOR*",
"COL",
@ -40,7 +40,7 @@
]
},
{
- "target_type": "NRM",
+ "target_type": "MAP_NRM",
"keywords": [
"NORMAL*",
"NORM*",
@ -49,27 +49,27 @@
]
},
{
- "target_type": "ROUGH",
+ "target_type": "MAP_ROUGH",
"keywords": [
"ROUGHNESS",
"ROUGH"
]
},
{
- "target_type": "ROUGH",
+ "target_type": "MAP_ROUGH",
"keywords": [
"GLOSS"
]
},
{
- "target_type": "AO",
+ "target_type": "MAP_AO",
"keywords": [
"AMBIENTOCCLUSION",
"AO"
]
},
{
- "target_type": "DISP",
+ "target_type": "MAP_DISP",
"keywords": [
"DISPLACEMENT",
"DISP",
@ -78,7 +78,7 @@
]
},
{
- "target_type": "REFL",
+ "target_type": "MAP_REFL",
"keywords": [
"REFLECTION",
"REFL",
@ -87,26 +87,26 @@
]
},
{
- "target_type": "SSS",
+ "target_type": "MAP_SSS",
"keywords": [
"SSS",
"SUBSURFACE*"
]
},
{
- "target_type": "FUZZ",
+ "target_type": "MAP_FUZZ",
"keywords": [
"FUZZ"
]
},
{
- "target_type": "IDMAP",
+ "target_type": "MAP_IDMAP",
"keywords": [
"IDMAP"
]
},
{
- "target_type": "MASK",
+ "target_type": "MAP_MASK",
"keywords": [
"OPAC*",
"TRANSP*",
@ -115,7 +115,7 @@
]
},
{
- "target_type": "METAL",
+ "target_type": "MAP_METAL",
"keywords": [
"METAL*",
"METALLIC"

View File

@ -29,7 +29,7 @@
],
"map_type_mapping": [
{
- "target_type": "COL",
+ "target_type": "MAP_COL",
"keywords": [
"COLOR*",
"COL",
@ -39,7 +39,7 @@
]
},
{
- "target_type": "NRM",
+ "target_type": "MAP_NRM",
"keywords": [
"NORMAL*",
"NORM*",
@ -47,28 +47,28 @@
]
},
{
- "target_type": "ROUGH",
+ "target_type": "MAP_ROUGH",
"keywords": [
"ROUGHNESS",
"ROUGH"
]
},
{
- "target_type": "ROUGH",
+ "target_type": "MAP_ROUGH",
"keywords": [
"GLOSS"
],
"is_gloss_source": true
},
{
- "target_type": "AO",
+ "target_type": "MAP_AO",
"keywords": [
"AMBIENTOCCLUSION",
"AO"
]
},
{
- "target_type": "DISP",
+ "target_type": "MAP_DISP",
"keywords": [
"DISPLACEMENT",
"DISP",
@ -77,7 +77,7 @@
]
},
{
- "target_type": "REFL",
+ "target_type": "MAP_REFL",
"keywords": [
"REFLECTION",
"REFL",
@ -86,26 +86,26 @@
]
},
{
- "target_type": "SSS",
+ "target_type": "MAP_SSS",
"keywords": [
"SSS",
"SUBSURFACE*"
]
},
{
- "target_type": "FUZZ",
+ "target_type": "MAP_FUZZ",
"keywords": [
"FUZZ"
]
},
{
- "target_type": "IDMAP",
+ "target_type": "MAP_IDMAP",
"keywords": [
"IDMAP"
]
},
{
- "target_type": "MASK",
+ "target_type": "MAP_MASK",
"keywords": [
"OPAC*",
"TRANSP*",
@ -114,7 +114,7 @@
]
},
{
- "target_type": "METAL",
+ "target_type": "MAP_METAL",
"keywords": [
"METAL*",
"METALLIC"

View File

@ -29,7 +29,7 @@
],
"map_type_mapping": [
{
- "target_type": "COL",
+ "target_type": "MAP_COL",
"keywords": [
"COLOR*",
"COL",
@ -39,7 +39,7 @@
]
},
{
- "target_type": "NRM",
+ "target_type": "MAP_NRM",
"keywords": [
"NORMAL*",
"NORM*",
@ -48,27 +48,27 @@
]
},
{
- "target_type": "ROUGH",
+ "target_type": "MAP_ROUGH",
"keywords": [
"ROUGHNESS",
"ROUGH"
]
},
{
- "target_type": "ROUGH",
+ "target_type": "MAP_ROUGH",
"keywords": [
"GLOSS"
]
},
{
- "target_type": "AO",
+ "target_type": "MAP_AO",
"keywords": [
"AMBIENTOCCLUSION",
"AO"
]
},
{
- "target_type": "DISP",
+ "target_type": "MAP_DISP",
"keywords": [
"DISPLACEMENT",
"DISP",
@ -77,7 +77,7 @@
]
},
{
- "target_type": "REFL",
+ "target_type": "MAP_REFL",
"keywords": [
"REFLECTION",
"REFL",
@ -86,27 +86,27 @@
]
},
{
- "target_type": "SSS",
+ "target_type": "MAP_SSS",
"keywords": [
"SSS",
"SUBSURFACE*"
]
},
{
- "target_type": "FUZZ",
+ "target_type": "MAP_FUZZ",
"keywords": [
"FUZZ"
]
},
{
- "target_type": "IDMAP",
+ "target_type": "MAP_IDMAP",
"keywords": [
"ID*",
"IDMAP"
]
},
{
- "target_type": "MASK",
+ "target_type": "MAP_MASK",
"keywords": [
"OPAC*",
"TRANS*",
@ -115,7 +115,7 @@
]
},
{
- "target_type": "METAL",
+ "target_type": "MAP_METAL",
"keywords": [
"METALNESS_",
"METALLIC"

View File

@ -55,9 +55,9 @@ REFERENCE_MAP_TYPES = ["COL", "COL-1", "COL-2"]
REFERENCE_RESOLUTION_ORDER = ["1K", "512", "2K", "4K"] # Adjust as needed
# Assumed filename pattern for processed images.
- # {asset_name}, {map_type}, {resolution}, {format} will be replaced.
+ # {assetname}, {maptype}, {resolution}, {ext} will be replaced (fields named after the [token] placeholders).
- # Check Asset Processor Tool's config.py (TARGET_FILENAME_PATTERN) if this is wrong.
+ # This should match OUTPUT_FILENAME_PATTERN from app_settings.json.
- IMAGE_FILENAME_PATTERN = "{asset_name}_{map_type}_{resolution}.{format}"
+ IMAGE_FILENAME_PATTERN = "{assetname}_{maptype}_{resolution}.{ext}"
# Fallback extensions to try if the primary format from metadata is not found
# Order matters - first found will be used.
@ -126,10 +126,10 @@ def reconstruct_image_path_with_fallback(asset_dir_path, asset_name, map_type, r
if primary_format:
try:
filename = IMAGE_FILENAME_PATTERN.format(
- asset_name=asset_name,
+ assetname=asset_name, # Token is 'assetname'
- map_type=map_type,
+ maptype=map_type, # Token is 'maptype'
- resolution=resolution,
+ resolution=resolution, # Token is 'resolution'
- format=primary_format.lower() # Ensure format is lowercase
+ ext=primary_format.lower() # Token is 'ext'
)
primary_path = asset_dir_path / filename
if primary_path.is_file():
@ -151,10 +151,10 @@ def reconstruct_image_path_with_fallback(asset_dir_path, asset_name, map_type, r
continue
try:
fallback_filename = IMAGE_FILENAME_PATTERN.format(
- asset_name=asset_name,
+ assetname=asset_name, # Token is 'assetname'
- map_type=map_type,
+ maptype=map_type, # Token is 'maptype'
- resolution=resolution,
+ resolution=resolution, # Token is 'resolution'
- format=ext.lower()
+ ext=ext.lower() # Token is 'ext'
)
fallback_path = asset_dir_path / fallback_filename
if fallback_path.is_file():
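One caveat worth noting for reviewers: Python's `str.format` substitutes `{name}` fields and leaves `[name]` text untouched, so a `.format()`-compatible pattern has to spell the token names with braces. A quick standalone check (sketch; the values are examples, not real metadata):

```python
# Hypothetical format()-compatible counterpart of the [assetname]_[maptype]_[resolution].[ext] tokens
IMAGE_FILENAME_PATTERN = "{assetname}_{maptype}_{resolution}.{ext}"

filename = IMAGE_FILENAME_PATTERN.format(
    assetname="Brick01",  # example values only
    maptype="COL",
    resolution="2K",
    ext="png",
)
```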

View File

@ -52,9 +52,9 @@ HIGHEST_RESOLUTION_NODE_LABEL = "HighestResolution" # Value node to store highes
ENABLE_MANIFEST = False # Disabled based on user feedback in previous run
# Assumed filename pattern for processed images.
- # {asset_name}, {map_type}, {resolution}, {format} will be replaced.
+ # {assetname}, {maptype}, {resolution}, {ext} will be replaced (fields named after the [token] placeholders).
- # Check Asset Processor Tool's config.py (TARGET_FILENAME_PATTERN) if this is wrong.
+ # This should match OUTPUT_FILENAME_PATTERN from app_settings.json.
- IMAGE_FILENAME_PATTERN = "{asset_name}_{map_type}_{resolution}.{format}"
+ IMAGE_FILENAME_PATTERN = "{assetname}_{maptype}_{resolution}.{ext}"
# Fallback extensions to try if the primary format from metadata is not found
# Order matters - first found will be used.
@ -150,10 +150,32 @@ def add_tag_if_new(asset_data, tag_name):
def get_color_space(map_type):
"""Returns the appropriate Blender color space name for a given map type string."""
- # Handle potential numbered variants like COL-1, COL-2
- base_map_type = map_type.split('-')[0]
- return PBR_COLOR_SPACE_MAP.get(map_type.upper(), # Check full name first (e.g., NRMRGH)
- PBR_COLOR_SPACE_MAP.get(base_map_type.upper(), DEFAULT_COLOR_SPACE)) # Fallback to base type
+ # Attempt to map map_type (e.g., "MAP_COL", "COL-1", "NRMRGH") to a standard type for color space lookup.
+ # PBR_COLOR_SPACE_MAP usually contains standard types like "COL", "NRM".
+ map_type_upper = map_type.upper()
# 1. Direct match (e.g., "NRMRGH", "COL")
if map_type_upper in PBR_COLOR_SPACE_MAP:
return PBR_COLOR_SPACE_MAP[map_type_upper]
# 2. Handle variants like "COL-1", "MAP_ROUGH-2"
# Try to get the part before a hyphen if a hyphen exists
base_type_candidate = map_type_upper.split('-')[0]
if base_type_candidate in PBR_COLOR_SPACE_MAP:
return PBR_COLOR_SPACE_MAP[base_type_candidate]
# 3. Handle cases like "MAP_COL" -> "COL"
# This is a simple heuristic. A more robust solution would involve access to FILE_TYPE_DEFINITIONS.
# For this script, we assume PBR_COLOR_SPACE_MAP might contain the direct standard_type.
# Example: if map_type is "MAP_DIFFUSE" and PBR_COLOR_SPACE_MAP has "DIFFUSE"
if base_type_candidate.startswith("MAP_") and len(base_type_candidate) > 4:
short_type = base_type_candidate[4:] # Get "COL" from "MAP_COL"
if short_type in PBR_COLOR_SPACE_MAP:
return PBR_COLOR_SPACE_MAP[short_type]
# Fallback if no specific rule found
# print(f" Debug: Color space for '{map_type}' (candidates: '{map_type_upper}', '{base_type_candidate}') not found in PBR_COLOR_SPACE_MAP. Using default: {DEFAULT_COLOR_SPACE}")
return DEFAULT_COLOR_SPACE
def calculate_aspect_correction_factor(image_width, image_height, aspect_string):
"""
@ -234,10 +256,10 @@ def reconstruct_image_path_with_fallback(asset_dir_path, asset_name, map_type, r
if primary_format:
try:
filename = IMAGE_FILENAME_PATTERN.format(
- asset_name=asset_name,
+ assetname=asset_name, # Token is 'assetname'
- map_type=map_type,
+ maptype=map_type, # Token is 'maptype'
- resolution=resolution,
+ resolution=resolution, # Token is 'resolution'
- format=primary_format.lower() # Ensure format is lowercase
+ ext=primary_format.lower() # Token is 'ext'
)
primary_path = asset_dir_path / filename
if primary_path.is_file():
@ -259,10 +281,10 @@ def reconstruct_image_path_with_fallback(asset_dir_path, asset_name, map_type, r
continue
try:
fallback_filename = IMAGE_FILENAME_PATTERN.format(
- asset_name=asset_name,
+ assetname=asset_name, # Token is 'assetname'
- map_type=map_type,
+ maptype=map_type, # Token is 'maptype'
- resolution=resolution,
+ resolution=resolution, # Token is 'resolution'
- format=ext.lower()
+ ext=ext.lower() # Token is 'ext'
)
fallback_path = asset_dir_path / fallback_filename
if fallback_path.is_file():
@ -493,7 +515,7 @@ def process_library(context, asset_library_root_override=None): # Add override p
asset_name = metadata.get("asset_name")
supplier_name = metadata.get("supplier_name")
archetype = metadata.get("archetype")
- asset_category = metadata.get("asset_category", "Unknown") # Read asset_category instead of category
+ asset_category = metadata.get("category", "Unknown") # Read "category" key from metadata
# Get map info from the correct keys
processed_resolutions = metadata.get("processed_map_resolutions", {}) # Default to empty dict
merged_resolutions = metadata.get("merged_map_resolutions", {}) # Get merged maps too

View File

@ -56,7 +56,8 @@
"diffuse"
],
"standard_type": "COL",
- "bit_depth_rule": "force_8bit"
+ "bit_depth_rule": "force_8bit",
"is_grayscale": false
},
"MAP_NRM": {
"description": "Normal Map",
@ -66,7 +67,8 @@
"_normal."
],
"standard_type": "NRM",
- "bit_depth_rule": "respect"
+ "bit_depth_rule": "respect",
"is_grayscale": false
},
"MAP_METAL": {
"description": "Metalness Map",
@ -76,7+78,8 @@
"_met."
],
"standard_type": "METAL",
- "bit_depth_rule": "force_8bit"
+ "bit_depth_rule": "force_8bit",
"is_grayscale": true
},
"MAP_ROUGH": {
"description": "Roughness Map",
@ -87,7 +90,8 @@
"_gloss"
],
"standard_type": "ROUGH",
- "bit_depth_rule": "force_8bit"
+ "bit_depth_rule": "force_8bit",
"is_grayscale": true
},
"MAP_AO": {
"description": "Ambient Occlusion Map",
@ -97,7+101,8 @@
"_ambientocclusion."
],
"standard_type": "AO",
- "bit_depth_rule": "force_8bit"
+ "bit_depth_rule": "force_8bit",
"is_grayscale": true
},
"MAP_DISP": {
"description": "Displacement/Height Map",
@ -107,7 +112,8 @@
"_height."
],
"standard_type": "DISP",
- "bit_depth_rule": "respect"
+ "bit_depth_rule": "respect",
"is_grayscale": true
},
"MAP_REFL": {
"description": "Reflection/Specular Map",
@ -117,7 +123,8 @@
"_specular."
],
"standard_type": "REFL",
- "bit_depth_rule": "force_8bit"
+ "bit_depth_rule": "force_8bit",
"is_grayscale": true
},
"MAP_SSS": {
"description": "Subsurface Scattering Map",
@ -127,7 +134,8 @@
"_subsurface."
],
"standard_type": "SSS",
- "bit_depth_rule": "respect"
+ "bit_depth_rule": "respect",
"is_grayscale": true
},
"MAP_FUZZ": {
"description": "Fuzz/Sheen Map",
@ -137,7 +145,8 @@
"_sheen."
],
"standard_type": "FUZZ",
- "bit_depth_rule": "force_8bit"
+ "bit_depth_rule": "force_8bit",
"is_grayscale": true
},
"MAP_IDMAP": {
"description": "ID Map (for masking)",
@ -147,7 +156,8 @@
"_matid."
],
"standard_type": "IDMAP",
- "bit_depth_rule": "force_8bit"
+ "bit_depth_rule": "force_8bit",
"is_grayscale": false
},
"MAP_MASK": {
"description": "Generic Mask Map",
@ -156,7 +166,8 @@
"_mask."
],
"standard_type": "MASK",
- "bit_depth_rule": "force_8bit"
+ "bit_depth_rule": "force_8bit",
"is_grayscale": true
}, },
"MAP_IMPERFECTION": { "MAP_IMPERFECTION": {
"description": "Imperfection Map (scratches, dust)", "description": "Imperfection Map (scratches, dust)",
@ -171,7 +182,8 @@
"fingerprints" "fingerprints"
], ],
"standard_type": "IMPERFECTION", "standard_type": "IMPERFECTION",
"bit_depth_rule": "force_8bit" "bit_depth_rule": "force_8bit",
"is_grayscale": true
}, },
"MODEL": { "MODEL": {
"description": "3D Model File", "description": "3D Model File",
@ -181,7 +193,8 @@
".obj" ".obj"
], ],
"standard_type": "", "standard_type": "",
"bit_depth_rule": "" "bit_depth_rule": "",
"is_grayscale": false
}, },
"EXTRA": { "EXTRA": {
"description": "asset previews or metadata", "description": "asset previews or metadata",
@ -196,7 +209,8 @@
"thumb" "thumb"
], ],
"standard_type": "", "standard_type": "",
"bit_depth_rule": "" "bit_depth_rule": "",
"is_grayscale": false
}, },
"FILE_IGNORE": { "FILE_IGNORE": {
"description": "File to be ignored", "description": "File to be ignored",
@ -206,23 +220,11 @@
".DS_Store" ".DS_Store"
], ],
"standard_type": "", "standard_type": "",
"bit_depth_rule": "" "bit_depth_rule": "",
"is_grayscale": false
} }
}, },
"TARGET_FILENAME_PATTERN": "{base_name}_{map_type}_{resolution}.{ext}", "TARGET_FILENAME_PATTERN": "{base_name}_{map_type}_{resolution}.{ext}",
"STANDARD_MAP_TYPES": [
"COL",
"NRM",
"ROUGH",
"METAL",
"AO",
"DISP",
"REFL",
"SSS",
"FUZZ",
"IDMAP",
"MASK"
],
"RESPECT_VARIANT_MAP_TYPES": [ "RESPECT_VARIANT_MAP_TYPES": [
"COL" "COL"
], ],

View File

@ -271,6 +271,16 @@ class Configuration:
raise ConfigurationError(f"Preset '{self.preset_name}': Rule at index {index} in 'map_type_mapping' must be a dictionary.") raise ConfigurationError(f"Preset '{self.preset_name}': Rule at index {index} in 'map_type_mapping' must be a dictionary.")
if 'target_type' not in rule or not isinstance(rule['target_type'], str): if 'target_type' not in rule or not isinstance(rule['target_type'], str):
raise ConfigurationError(f"Preset '{self.preset_name}': Rule at index {index} in 'map_type_mapping' is missing 'target_type' string.") raise ConfigurationError(f"Preset '{self.preset_name}': Rule at index {index} in 'map_type_mapping' is missing 'target_type' string.")
# Validate target_type against FILE_TYPE_DEFINITIONS keys
valid_file_type_keys = self._core_settings.get('FILE_TYPE_DEFINITIONS', {}).keys()
if rule['target_type'] not in valid_file_type_keys:
raise ConfigurationError(
f"Preset '{self.preset_name}': Rule at index {index} in 'map_type_mapping' "
f"has an invalid 'target_type': '{rule['target_type']}'. "
f"Must be one of {list(valid_file_type_keys)}."
)
if 'keywords' not in rule or not isinstance(rule['keywords'], list): if 'keywords' not in rule or not isinstance(rule['keywords'], list):
raise ConfigurationError(f"Preset '{self.preset_name}': Rule at index {index} in 'map_type_mapping' is missing 'keywords' list.") raise ConfigurationError(f"Preset '{self.preset_name}': Rule at index {index} in 'map_type_mapping' is missing 'keywords' list.")
for kw_index, keyword in enumerate(rule['keywords']): for kw_index, keyword in enumerate(rule['keywords']):
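The new validation step above rejects any `map_type_mapping` rule whose `target_type` is not a `FILE_TYPE_DEFINITIONS` key, which enforces the commit's convention that presets reference FTD keys like `"MAP_COL"` rather than bare aliases like `"COL"`. A standalone sketch of that check (names are illustrative):

```python
# Sketch: every rule's 'target_type' must be a FILE_TYPE_DEFINITIONS key.
def validate_target_types(rules: list[dict], ftd_keys: set[str]) -> None:
    for index, rule in enumerate(rules):
        target = rule.get("target_type")
        if target not in ftd_keys:
            raise ValueError(
                f"Rule at index {index} has invalid 'target_type': {target!r}. "
                f"Must be one of {sorted(ftd_keys)}."
            )

ftd_keys = {"MAP_COL", "MAP_NRM", "MAP_ROUGH"}
validate_target_types([{"target_type": "MAP_COL", "keywords": ["albedo"]}], ftd_keys)  # passes
try:
    validate_target_types([{"target_type": "COL", "keywords": ["albedo"]}], ftd_keys)
except ValueError as e:
    print("rejected:", e)  # the bare alias "COL" is no longer accepted
```
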
@ -289,10 +299,20 @@ class Configuration:
# --- End: Added validation for new output patterns --- # --- End: Added validation for new output patterns ---
if not isinstance(self._core_settings.get('IMAGE_RESOLUTIONS'), dict): if not isinstance(self._core_settings.get('IMAGE_RESOLUTIONS'), dict):
raise ConfigurationError("Core config 'IMAGE_RESOLUTIONS' must be a dictionary.") raise ConfigurationError("Core config 'IMAGE_RESOLUTIONS' must be a dictionary.")
if not isinstance(self._core_settings.get('STANDARD_MAP_TYPES'), list):
raise ConfigurationError("Core config 'STANDARD_MAP_TYPES' must be a list.")
-            # LLM settings validation (check if keys exist if the file was loaded)
+            # Validate DEFAULT_ASSET_CATEGORY
valid_asset_type_keys = self._core_settings.get('ASSET_TYPE_DEFINITIONS', {}).keys()
default_asset_category_value = self._core_settings.get('DEFAULT_ASSET_CATEGORY')
if not default_asset_category_value:
raise ConfigurationError("Core config 'DEFAULT_ASSET_CATEGORY' is missing.")
if default_asset_category_value not in valid_asset_type_keys:
raise ConfigurationError(
f"Core config 'DEFAULT_ASSET_CATEGORY' ('{default_asset_category_value}') "
f"is not a valid key in ASSET_TYPE_DEFINITIONS. "
f"Must be one of {list(valid_asset_type_keys)}."
)
# LLM settings validation (check if keys exist if the file was loaded)
if self._llm_settings: # Only validate if LLM settings were loaded if self._llm_settings: # Only validate if LLM settings were loaded
required_llm_keys = [ # Indent this block required_llm_keys = [ # Indent this block
"llm_predictor_examples", "llm_endpoint_url", "llm_api_key", "llm_predictor_examples", "llm_endpoint_url", "llm_api_key",
@ -319,8 +339,8 @@ class Configuration:
@property @property
def default_asset_category(self) -> str: def default_asset_category(self) -> str:
"""Gets the default asset category from core settings.""" """Gets the default asset category from core settings."""
-        # Provide a fallback default just in case it's missing from config.py
-        return self._core_settings.get('DEFAULT_ASSET_CATEGORY', 'Texture')
+        # Fallback should align with a valid key, and validation should catch issues.
+        return self._core_settings.get('DEFAULT_ASSET_CATEGORY', 'Surface')
@property @property
def target_filename_pattern(self) -> str: def target_filename_pattern(self) -> str:
@ -422,19 +442,44 @@ class Configuration:
"""Gets the list of map types that must always be saved losslessly.""" """Gets the list of map types that must always be saved losslessly."""
return self._core_settings.get('FORCE_LOSSLESS_MAP_TYPES', []) return self._core_settings.get('FORCE_LOSSLESS_MAP_TYPES', [])
-    def get_bit_depth_rule(self, map_type: str) -> str:
-        """Gets the bit depth rule ('respect' or 'force_8bit') for a given standard map type."""
-        # Access the FILE_TYPE_DEFINITIONS from core settings
-        file_type_definitions = self._core_settings.get('FILE_TYPE_DEFINITIONS', {})
-        # Iterate through definitions to find the matching map type
-        for definition in file_type_definitions.values():
-            if definition.get('standard_type') == map_type:
-                # Found the definition, check for 'bit_depth_rule'
-                return definition.get('bit_depth_rule', 'respect')
-        # If map type definition not found, return default rule
-        return 'respect'
+    def get_bit_depth_rule(self, map_type_input: str) -> str:
+        """
+        Gets the bit depth rule ('respect', 'force_8bit', 'force_16bit') for a given map type identifier.
+        The map_type_input can be an FTD key (e.g., "MAP_COL") or a suffixed FTD key (e.g., "MAP_COL-1").
+        """
+        if not self._core_settings or 'FILE_TYPE_DEFINITIONS' not in self._core_settings:
+            log.warning("FILE_TYPE_DEFINITIONS not found in core settings. Cannot determine bit depth rule.")
+            return "respect"
+        file_type_definitions = self._core_settings['FILE_TYPE_DEFINITIONS']
+        # 1. Try direct match with map_type_input as FTD key
+        definition = file_type_definitions.get(map_type_input)
+        if definition:
+            rule = definition.get('bit_depth_rule')
+            if rule in ['respect', 'force_8bit', 'force_16bit']:
+                return rule
+            else:
+                log.warning(f"FTD key '{map_type_input}' found, but 'bit_depth_rule' is missing or invalid: '{rule}'. Defaulting to 'respect'.")
+                return "respect"
+        # 2. Try to derive base FTD key by stripping common variant suffixes
+        # Regex to remove trailing suffixes like -<digits>, -<alphanum>, _<alphanum>
+        base_ftd_key_candidate = re.sub(r"(-[\w\d]+|_[\w\d]+)$", "", map_type_input)
+        if base_ftd_key_candidate != map_type_input: # Check if stripping occurred
+            definition = file_type_definitions.get(base_ftd_key_candidate)
+            if definition:
+                rule = definition.get('bit_depth_rule')
+                if rule in ['respect', 'force_8bit', 'force_16bit']:
+                    log.debug(f"Derived base FTD key '{base_ftd_key_candidate}' from '{map_type_input}' and found bit depth rule: {rule}")
+                    return rule
+                else:
+                    log.warning(f"Derived base FTD key '{base_ftd_key_candidate}' from '{map_type_input}', but 'bit_depth_rule' is missing/invalid: '{rule}'. Defaulting to 'respect'.")
+                    return "respect"
+        # If no match found after trying direct and derived keys
+        log.warning(f"Map type identifier '{map_type_input}' (or its derived base) not found in FILE_TYPE_DEFINITIONS. Defaulting bit depth rule to 'respect'.")
+        return "respect"
def get_16bit_output_formats(self) -> tuple[str, str]: def get_16bit_output_formats(self) -> tuple[str, str]:
"""Gets the primary and fallback format names for 16-bit output.""" """Gets the primary and fallback format names for 16-bit output."""
@ -446,8 +491,23 @@ class Configuration:
"""Gets the format name for 8-bit output.""" """Gets the format name for 8-bit output."""
return self._core_settings.get('OUTPUT_FORMAT_8BIT', 'png').lower() return self._core_settings.get('OUTPUT_FORMAT_8BIT', 'png').lower()
# --- LLM Prompt Data Accessors --- def get_standard_map_type_aliases(self) -> list[str]:
"""
Derives a sorted list of unique standard map type aliases
from FILE_TYPE_DEFINITIONS.
"""
aliases = set()
file_type_definitions = self._core_settings.get('FILE_TYPE_DEFINITIONS', {})
for _key, definition in file_type_definitions.items():
if isinstance(definition, dict):
standard_type = definition.get('standard_type')
# Ensure standard_type is a non-empty string
if standard_type and isinstance(standard_type, str) and standard_type.strip():
aliases.add(standard_type)
return sorted(list(aliases))
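`get_standard_map_type_aliases` replaces the deleted `STANDARD_MAP_TYPES` list by deriving the aliases from `FILE_TYPE_DEFINITIONS` itself, so the two can no longer drift apart. A self-contained sketch of the same derivation:

```python
# Sketch: derive sorted, unique standard-type aliases from FILE_TYPE_DEFINITIONS,
# skipping empty aliases such as those on MODEL/EXTRA entries.
def standard_map_type_aliases(ftd: dict) -> list[str]:
    aliases = set()
    for definition in ftd.values():
        if isinstance(definition, dict):
            standard_type = definition.get("standard_type")
            if isinstance(standard_type, str) and standard_type.strip():
                aliases.add(standard_type)
    return sorted(aliases)

ftd = {
    "MAP_COL": {"standard_type": "COL"},
    "MAP_NRM": {"standard_type": "NRM"},
    "MODEL": {"standard_type": ""},  # empty alias -> skipped
}
assert standard_map_type_aliases(ftd) == ["COL", "NRM"]
```
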
# --- LLM Prompt Data Accessors ---
def get_asset_type_definitions(self) -> dict: def get_asset_type_definitions(self) -> dict:
"""Returns the ASSET_TYPE_DEFINITIONS dictionary from core settings.""" """Returns the ASSET_TYPE_DEFINITIONS dictionary from core settings."""
return self._core_settings.get('ASSET_TYPE_DEFINITIONS', {}) return self._core_settings.get('ASSET_TYPE_DEFINITIONS', {})

View File

@ -242,7 +242,6 @@ class ConfigEditorDialog(QDialog):
# Clear potentially lingering widget references for this tab # Clear potentially lingering widget references for this tab
self.widgets.pop("TARGET_FILENAME_PATTERN", None) self.widgets.pop("TARGET_FILENAME_PATTERN", None)
self.widgets.pop("STANDARD_MAP_TYPES_LIST", None)
self.widgets.pop("RESPECT_VARIANT_MAP_TYPES", None) self.widgets.pop("RESPECT_VARIANT_MAP_TYPES", None)
self.widgets.pop("ASPECT_RATIO_DECIMALS", None) self.widgets.pop("ASPECT_RATIO_DECIMALS", None)
@ -276,24 +275,7 @@ class ConfigEditorDialog(QDialog):
self.widgets["ASPECT_RATIO_DECIMALS"] = aspect_ratio_spinbox self.widgets["ASPECT_RATIO_DECIMALS"] = aspect_ratio_spinbox
main_tab_layout.addLayout(form_layout) main_tab_layout.addLayout(form_layout)
# 4. STANDARD_MAP_TYPES: QListWidget + Add/Remove Buttons
standard_maps_layout = QVBoxLayout()
standard_maps_label = QLabel("Standard Map Types:")
standard_maps_layout.addWidget(standard_maps_label)
standard_maps_button_layout = QHBoxLayout()
add_button = QPushButton("Add")
remove_button = QPushButton("Remove")
# TODO: Connect add/remove buttons signals
standard_maps_button_layout.addWidget(add_button)
standard_maps_button_layout.addWidget(remove_button)
standard_maps_button_layout.addStretch() # Push buttons left
standard_maps_layout.addLayout(standard_maps_button_layout)
main_tab_layout.addLayout(standard_maps_layout)
# Add the main layout to the tab's provided layout # Add the main layout to the tab's provided layout
layout.addLayout(main_tab_layout) layout.addLayout(main_tab_layout)
layout.addStretch() # Keep stretch at the end of the tab's main layout layout.addStretch() # Keep stretch at the end of the tab's main layout
@ -900,7 +882,7 @@ class ConfigEditorDialog(QDialog):
# This is the final key, update the value # This is the final key, update the value
if isinstance(widget, QLineEdit): if isinstance(widget, QLineEdit):
# Handle simple lists displayed as comma-separated strings # Handle simple lists displayed as comma-separated strings
if key in ["STANDARD_MAP_TYPES", "RESPECT_VARIANT_MAP_TYPES"]: if key == "RESPECT_VARIANT_MAP_TYPES":
current_dict[k] = [item.strip() for item in widget.text().split(',') if item.strip()] current_dict[k] = [item.strip() for item in widget.text().split(',') if item.strip()]
else: else:
current_dict[k] = widget.text() current_dict[k] = widget.text()
@ -1039,7 +1021,7 @@ class ConfigEditorDialog(QDialog):
widget = self.widgets[key] widget = self.widgets[key]
if isinstance(widget, QLineEdit): if isinstance(widget, QLineEdit):
# Handle simple lists displayed as comma-separated strings # Handle simple lists displayed as comma-separated strings
if key in ["STANDARD_MAP_TYPES", "RESPECT_VARIANT_MAP_TYPES"] and isinstance(value, list): if key == "RESPECT_VARIANT_MAP_TYPES" and isinstance(value, list):
widget.setText(", ".join(map(str, value))) widget.setText(", ".join(map(str, value)))
elif isinstance(value, (str, int, float, bool)): # Also handle cases where simple types might be in QLineEdit elif isinstance(value, (str, int, float, bool)): # Also handle cases where simple types might be in QLineEdit
widget.setText(str(value)) widget.setText(str(value))

View File

@ -214,29 +214,18 @@ class SupplierSearchDelegate(QStyledItemDelegate):
class ItemTypeSearchDelegate(QStyledItemDelegate): class ItemTypeSearchDelegate(QStyledItemDelegate):
""" """
Delegate for editing item types using a QLineEdit with auto-completion. Delegate for editing item types using a QLineEdit with auto-completion.
-    Loads known item types from the UnifiedViewModel's cached keys.
+    Loads known item types from the provided list.
""" """
-    def __init__(self, parent=None):
+    def __init__(self, item_type_keys: list[str] | None = None, parent=None):
super().__init__(parent) super().__init__(parent)
-        # No persistent list needed here, suggestions come from the model
+        self.item_type_keys = item_type_keys if item_type_keys else []
+        log.debug(f"ItemTypeSearchDelegate initialized with {len(self.item_type_keys)} keys: {self.item_type_keys}")
def createEditor(self, parent, option, index: QModelIndex): def createEditor(self, parent, option, index: QModelIndex):
"""Creates the QLineEdit editor with a QCompleter.""" """Creates the QLineEdit editor with a QCompleter."""
editor = QLineEdit(parent) editor = QLineEdit(parent)
-        model = index.model()
-        item_keys = []
+        # Use the keys passed during initialization
+        completer = QCompleter(self.item_type_keys, editor)
# Get keys directly from the UnifiedViewModel
if hasattr(model, '_file_type_keys'):
try:
item_keys = model._file_type_keys # Use cached keys
except Exception as e:
log.error(f"Error getting _file_type_keys from model in ItemTypeSearchDelegate: {e}")
item_keys = []
else:
log.warning("ItemTypeSearchDelegate: Model is missing _file_type_keys attribute. Suggestions will be empty.")
completer = QCompleter(item_keys, editor)
completer.setCaseSensitivity(Qt.CaseInsensitive) completer.setCaseSensitivity(Qt.CaseInsensitive)
completer.setFilterMode(Qt.MatchContains) completer.setFilterMode(Qt.MatchContains)
completer.setCompletionMode(QCompleter.PopupCompletion) completer.setCompletionMode(QCompleter.PopupCompletion)
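The delegate's completer is configured with `Qt.CaseInsensitive` and `Qt.MatchContains`, so typing any fragment of an FTD key pops up all keys containing it. A plain-Python analog of that filtering behaviour (no Qt dependency; purely illustrative of what the completer settings do):

```python
# Analog of QCompleter with CaseInsensitive + MatchContains:
# return every key containing the typed fragment, ignoring case.
def contains_matches(keys: list[str], fragment: str) -> list[str]:
    needle = fragment.lower()
    return [k for k in keys if needle in k.lower()]

keys = ["MAP_COL", "MAP_NRM", "MAP_ROUGH", "MODEL"]
assert contains_matches(keys, "col") == ["MAP_COL"]
assert contains_matches(keys, "map_") == ["MAP_COL", "MAP_NRM", "MAP_ROUGH"]
```
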

View File

@ -71,16 +71,18 @@ class MainPanelWidget(QWidget):
# Notify when Blender settings change # Notify when Blender settings change
blender_settings_changed = Signal(bool, str, str) # enabled, ng_path, mat_path blender_settings_changed = Signal(bool, str, str) # enabled, ng_path, mat_path
-    def __init__(self, unified_model: UnifiedViewModel, parent=None):
+    def __init__(self, unified_model: UnifiedViewModel, parent=None, file_type_keys: list[str] | None = None):
""" """
Initializes the MainPanelWidget. Initializes the MainPanelWidget.
Args: Args:
unified_model: The shared UnifiedViewModel instance. unified_model: The shared UnifiedViewModel instance.
parent: The parent widget. parent: The parent widget.
file_type_keys: A list of available file type names (keys from FILE_TYPE_DEFINITIONS).
""" """
super().__init__(parent) super().__init__(parent)
self.unified_model = unified_model self.unified_model = unified_model
self.file_type_keys = file_type_keys if file_type_keys else []
self.llm_processing_active = False # Track if LLM is running (set by MainWindow) self.llm_processing_active = False # Track if LLM is running (set by MainWindow)
# Get project root for resolving default paths if needed here # Get project root for resolving default paths if needed here
@ -137,7 +139,8 @@ class MainPanelWidget(QWidget):
# TODO: Revisit ComboBoxDelegate dependency # TODO: Revisit ComboBoxDelegate dependency
comboBoxDelegate = ComboBoxDelegate(self) # Pass only parent (self) comboBoxDelegate = ComboBoxDelegate(self) # Pass only parent (self)
supplierSearchDelegate = SupplierSearchDelegate(self) # Pass parent supplierSearchDelegate = SupplierSearchDelegate(self) # Pass parent
-        itemTypeSearchDelegate = ItemTypeSearchDelegate(self) # Instantiate new delegate
+        # Pass file_type_keys to ItemTypeSearchDelegate
+        itemTypeSearchDelegate = ItemTypeSearchDelegate(self.file_type_keys, self)
# Set Delegates for Columns # Set Delegates for Columns
self.unified_view.setItemDelegateForColumn(UnifiedViewModel.COL_SUPPLIER, supplierSearchDelegate) self.unified_view.setItemDelegateForColumn(UnifiedViewModel.COL_SUPPLIER, supplierSearchDelegate)

View File

@ -161,8 +161,22 @@ class MainWindow(QMainWindow):
# --- Create Panels --- # --- Create Panels ---
self.preset_editor_widget = PresetEditorWidget() self.preset_editor_widget = PresetEditorWidget()
self.llm_editor_widget = LLMEditorWidget() # Instantiate the LLM editor self.llm_editor_widget = LLMEditorWidget() # Instantiate the LLM editor
-        # Instantiate MainPanelWidget, passing the model and self (MainWindow) for context
-        self.main_panel_widget = MainPanelWidget(self.unified_model, self)
+        # --- Load File Type Definitions for Rule Editor ---
file_type_keys = []
try:
# Attempt to load from base config first
base_cfg_data = load_base_config()
if base_cfg_data and "FILE_TYPE_DEFINITIONS" in base_cfg_data:
file_type_keys = list(base_cfg_data["FILE_TYPE_DEFINITIONS"].keys())
log.info(f"Loaded {len(file_type_keys)} FILE_TYPE_DEFINITIONS keys for RuleEditor.")
else:
log.warning("FILE_TYPE_DEFINITIONS not found in base_config. RuleEditor item_type dropdown might be empty.")
except Exception as e:
log.exception(f"Error loading FILE_TYPE_DEFINITIONS for RuleEditor: {e}")
# Instantiate MainPanelWidget, passing the model, self (MainWindow) for context, and file_type_keys
self.main_panel_widget = MainPanelWidget(self.unified_model, self, file_type_keys=file_type_keys)
self.log_console = LogConsoleWidget(self) self.log_console = LogConsoleWidget(self)
# --- Create Left Pane with Static Selector and Stacked Editor --- # --- Create Left Pane with Static Selector and Stacked Editor ---

View File

@ -407,24 +407,95 @@ class RuleBasedPredictionHandler(BasePredictionHandler):
preset_name=preset_name preset_name=preset_name
) )
asset_rules = [] asset_rules = []
-        asset_type_definitions = config._core_settings.get('ASSET_TYPE_DEFINITIONS', {})
+        # asset_type_definitions = config._core_settings.get('ASSET_TYPE_DEFINITIONS', {}) # Use accessor
file_type_definitions = config._core_settings.get('FILE_TYPE_DEFINITIONS', {}) file_type_definitions = config._core_settings.get('FILE_TYPE_DEFINITIONS', {})
for asset_name, files_info in classified_assets.items(): for asset_name, files_info in classified_assets.items():
if self._is_cancelled: raise RuntimeError("Prediction cancelled during hierarchy building (assets).") if self._is_cancelled: raise RuntimeError("Prediction cancelled during hierarchy building (assets).")
if not files_info: continue if not files_info: continue
-            item_types_in_asset = {f_info['item_type'] for f_info in files_info}
-            predicted_asset_type = "Surface"
-            material_indicators = {"MAP_COL", "MAP_NRM", "MAP_ROUGH", "MAP_METAL", "MAP_AO", "MAP_DISP", "COL", "NRM", "ROUGH", "METAL", "AO", "DISP"}
-            if any(it in material_indicators for it in item_types_in_asset if it not in ["EXTRA", "FILE_IGNORE"]):
-                predicted_asset_type = "Surface"
-            if asset_type_definitions and predicted_asset_type not in asset_type_definitions:
-                log.warning(f"Predicted AssetType '{predicted_asset_type}' for asset '{asset_name}' is not in ASSET_TYPE_DEFINITIONS. Falling back.")
-                default_type = config.default_asset_category
-                if default_type in asset_type_definitions: predicted_asset_type = default_type
-                elif asset_type_definitions: predicted_asset_type = list(asset_type_definitions.keys())[0]
+            asset_category_rules = config.asset_category_rules
+            asset_type_definitions = config.get_asset_type_definitions() # Use new accessor
+            asset_type_keys = list(asset_type_definitions.keys())
+            # Initialize predicted_asset_type using the validated default
+            predicted_asset_type = config.default_asset_category
+            log.debug(f"Asset '{asset_name}': Initial predicted_asset_type set to default: '{predicted_asset_type}'.")
+            # 1. Check asset_category_rules from preset
determined_by_rule = False
# Check for Model type based on file patterns
if "Model" in asset_type_keys:
model_patterns_regex = config.compiled_model_regex # Already compiled
for f_info in files_info:
# Only consider files not marked as EXTRA or FILE_IGNORE for model classification
if f_info['item_type'] in ["EXTRA", "FILE_IGNORE"]:
continue
file_path_obj = Path(f_info['file_path'])
for pattern_re in model_patterns_regex:
if pattern_re.search(file_path_obj.name):
predicted_asset_type = "Model"
determined_by_rule = True
log.debug(f"Asset '{asset_name}' classified as 'Model' due to file '{file_path_obj.name}' matching pattern '{pattern_re.pattern}'.")
break
if determined_by_rule:
break
# Check for Decal type based on keywords in asset name (if not already Model)
if not determined_by_rule and "Decal" in asset_type_keys:
decal_keywords = asset_category_rules.get('decal_keywords', [])
for keyword in decal_keywords:
# Ensure keyword is a string before trying to escape it
if isinstance(keyword, str) and keyword: # Added check for non-empty string
try:
if re.search(r'\b' + re.escape(keyword) + r'\b', asset_name, re.IGNORECASE): # Match whole word
predicted_asset_type = "Decal"
determined_by_rule = True
log.debug(f"Asset '{asset_name}' classified as 'Decal' due to keyword '{keyword}'.")
break
except re.error as e_re:
log.warning(f"Regex error with decal_keyword '{keyword}': {e_re}")
if determined_by_rule:
pass # Already logged if Decal
# 2. If not determined by specific rules, check for Surface (if not Model/Decal by rule)
if not determined_by_rule and predicted_asset_type == config.default_asset_category and "Surface" in asset_type_keys:
item_types_in_asset = {f_info['item_type'] for f_info in files_info}
# Ensure we are checking against standard map types from FILE_TYPE_DEFINITIONS
# This check is primarily for PBR texture sets.
material_indicators = {
ft_key for ft_key, ft_def in config.get_file_type_definitions_with_examples().items()
if ft_def.get('standard_type') and ft_def.get('standard_type') not in ["", "EXTRA", "FILE_IGNORE", "MODEL"]
}
# Add common direct standard types as well for robustness
material_indicators.update({"COL", "NRM", "ROUGH", "METAL", "AO", "DISP"})
has_material_map = False
for item_type in item_types_in_asset:
# Check if the item_type itself is a material indicator or its standard_type is
if item_type in material_indicators:
has_material_map = True
break
# Check standard type if item_type is a key in FILE_TYPE_DEFINITIONS
item_def = config.get_file_type_definitions_with_examples().get(item_type)
if item_def and item_def.get('standard_type') in material_indicators:
has_material_map = True
break
if has_material_map:
predicted_asset_type = "Surface"
log.debug(f"Asset '{asset_name}' classified as 'Surface' due to material indicators.")
# 3. Final validation: Ensure predicted_asset_type is a valid key.
# config.default_asset_category is already validated to be a key.
if predicted_asset_type not in asset_type_keys:
log.warning(f"Derived AssetType '{predicted_asset_type}' for asset '{asset_name}' is not in ASSET_TYPE_DEFINITIONS. "
f"Falling back to default: '{config.default_asset_category}'.")
predicted_asset_type = config.default_asset_category
# This case should ideally not be hit if logic above correctly uses asset_type_keys
# and default_asset_category is valid.
asset_rule = AssetRule(asset_name=asset_name, asset_type=predicted_asset_type) asset_rule = AssetRule(asset_name=asset_name, asset_type=predicted_asset_type)
file_rules = [] file_rules = []
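The hunk above replaces the old single-check heuristic with a cascade: model file patterns first, then decal keywords matched as whole words in the asset name, then material-map indicators, with the validated default category as the fallback. A simplified, self-contained sketch of that cascade (the real handler also re-validates against `asset_type_keys` and skips `EXTRA`/`FILE_IGNORE` files; names here are illustrative):

```python
import re

# Simplified sketch of the asset-type cascade in RuleBasedPredictionHandler.
def classify_asset(asset_name, file_names, item_types,
                   model_patterns, decal_keywords, material_indicators,
                   default_category="Surface"):
    # 1. Model: any file matching a compiled model pattern
    for name in file_names:
        if any(p.search(name) for p in model_patterns):
            return "Model"
    # 2. Decal: whole-word keyword match in the asset name
    for keyword in decal_keywords:
        if re.search(r"\b" + re.escape(keyword) + r"\b", asset_name, re.IGNORECASE):
            return "Decal"
    # 3. Surface: any PBR material-map indicator present
    if any(it in material_indicators for it in item_types):
        return "Surface"
    return default_category

model_patterns = [re.compile(r"\.fbx$", re.IGNORECASE)]
assert classify_asset("OldChair", ["chair.fbx"], set(),
                      model_patterns, ["decal"], {"MAP_COL"}) == "Model"
assert classify_asset("Leak-Decal-01", ["leak.png"], set(),
                      model_patterns, ["decal"], {"MAP_COL"}) == "Decal"
```

Note the whole-word match means a keyword like `decal` only fires when it is delimited by non-word characters in the asset name (`Leak-Decal-01` matches, `LeakDecal01` does not).
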

View File

@ -20,6 +20,8 @@ script_dir = Path(__file__).parent
project_root = script_dir.parent project_root = script_dir.parent
PRESETS_DIR = project_root / "Presets" # Corrected path PRESETS_DIR = project_root / "Presets" # Corrected path
TEMPLATE_PATH = PRESETS_DIR / "_template.json" TEMPLATE_PATH = PRESETS_DIR / "_template.json"
APP_SETTINGS_PATH_LOCAL = project_root / "config" / "app_settings.json"
log = logging.getLogger(__name__) log = logging.getLogger(__name__)
@ -49,6 +51,7 @@ class PresetEditorWidget(QWidget):
self._init_ui() self._init_ui()
# --- Initial State --- # --- Initial State ---
self._ftd_keys = self._get_file_type_definition_keys() # Load FTD keys
self._clear_editor() # Clear/disable editor fields initially self._clear_editor() # Clear/disable editor fields initially
self._set_editor_enabled(False) # Disable editor initially self._set_editor_enabled(False) # Disable editor initially
self.populate_presets() # Populate preset list self.populate_presets() # Populate preset list
@ -56,6 +59,24 @@ class PresetEditorWidget(QWidget):
# --- Connect Editor Signals --- # --- Connect Editor Signals ---
self._connect_editor_change_signals() self._connect_editor_change_signals()
def _get_file_type_definition_keys(self) -> list[str]:
"""Loads FILE_TYPE_DEFINITIONS keys from app_settings.json."""
keys = []
try:
if APP_SETTINGS_PATH_LOCAL.is_file():
with open(APP_SETTINGS_PATH_LOCAL, 'r', encoding='utf-8') as f:
settings = json.load(f)
ftd = settings.get("FILE_TYPE_DEFINITIONS", {})
keys = list(ftd.keys())
log.debug(f"Successfully loaded {len(keys)} FILE_TYPE_DEFINITIONS keys.")
else:
log.error(f"app_settings.json not found at {APP_SETTINGS_PATH_LOCAL} for PresetEditorWidget.")
except json.JSONDecodeError as e:
log.error(f"Failed to parse app_settings.json in PresetEditorWidget: {e}")
except Exception as e:
log.error(f"Error loading FILE_TYPE_DEFINITIONS keys in PresetEditorWidget: {e}")
return keys
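`_get_file_type_definition_keys` above reads the FTD keys straight from `app_settings.json` and degrades to an empty list on any error, so the preset editor still opens with empty combo boxes rather than crashing. A standalone sketch of that loading pattern (path handling is illustrative):

```python
import json
from pathlib import Path

# Sketch: load FILE_TYPE_DEFINITIONS keys from a settings file,
# returning [] on a missing file or malformed JSON.
def load_ftd_keys(settings_path: Path) -> list[str]:
    try:
        with open(settings_path, "r", encoding="utf-8") as f:
            settings = json.load(f)
        return list(settings.get("FILE_TYPE_DEFINITIONS", {}).keys())
    except (OSError, json.JSONDecodeError):
        return []

# Example against a temporary file (the real path is config/app_settings.json):
import tempfile
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as tmp:
    json.dump({"FILE_TYPE_DEFINITIONS": {"MAP_COL": {}, "MODEL": {}}}, tmp)
print(load_ftd_keys(Path(tmp.name)))  # ['MAP_COL', 'MODEL']
```
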
def _init_ui(self): def _init_ui(self):
"""Initializes the UI elements for the preset editor.""" """Initializes the UI elements for the preset editor."""
main_layout = QVBoxLayout(self) main_layout = QVBoxLayout(self)
@ -306,7 +327,21 @@ class PresetEditorWidget(QWidget):
"""Adds an empty row to the specified table widget in the editor.""" """Adds an empty row to the specified table widget in the editor."""
row_count = table_widget.rowCount() row_count = table_widget.rowCount()
table_widget.insertRow(row_count) table_widget.insertRow(row_count)
for col in range(table_widget.columnCount()): table_widget.setItem(row_count, col, QTableWidgetItem(""))
if table_widget == self.editor_table_map_type_mapping:
# Column 0: Standard Type (QComboBox)
combo_box = QComboBox()
if self._ftd_keys:
combo_box.addItems(self._ftd_keys)
else:
log.warning("FILE_TYPE_DEFINITIONS keys not available for ComboBox in map_type_mapping.")
combo_box.currentIndexChanged.connect(self._mark_editor_unsaved) # Mark unsaved on change
table_widget.setCellWidget(row_count, 0, combo_box)
# Column 1: Input Keywords (QTableWidgetItem)
table_widget.setItem(row_count, 1, QTableWidgetItem(""))
else: # For other tables
for col in range(table_widget.columnCount()):
table_widget.setItem(row_count, col, QTableWidgetItem(""))
self._mark_editor_unsaved() self._mark_editor_unsaved()
def _editor_remove_table_row(self, table_widget: QTableWidget): def _editor_remove_table_row(self, table_widget: QTableWidget):
@ -409,18 +444,36 @@ class PresetEditorWidget(QWidget):
self.editor_table_bit_depth_variants.setItem(i, 1, QTableWidgetItem(pattern)) self.editor_table_bit_depth_variants.setItem(i, 1, QTableWidgetItem(pattern))
self.editor_list_extra_patterns.clear() self.editor_list_extra_patterns.clear()
self.editor_list_extra_patterns.addItems(preset_data.get("move_to_extra_patterns", [])) self.editor_list_extra_patterns.addItems(preset_data.get("move_to_extra_patterns", []))
-        self.editor_table_map_type_mapping.setRowCount(0)
+        self.editor_table_map_type_mapping.setRowCount(0) # Clear before populating
map_mappings = preset_data.get("map_type_mapping", []) map_mappings = preset_data.get("map_type_mapping", [])
for i, mapping_dict in enumerate(map_mappings): for i, mapping_dict in enumerate(map_mappings):
if isinstance(mapping_dict, dict) and "target_type" in mapping_dict and "keywords" in mapping_dict: if isinstance(mapping_dict, dict) and "target_type" in mapping_dict and "keywords" in mapping_dict:
std_type = mapping_dict["target_type"] std_type = mapping_dict["target_type"]
keywords = mapping_dict["keywords"] keywords = mapping_dict["keywords"]
self.editor_table_map_type_mapping.insertRow(i) self.editor_table_map_type_mapping.insertRow(i)
self.editor_table_map_type_mapping.setItem(i, 0, QTableWidgetItem(std_type))
# Column 0: Standard Type (QComboBox)
combo_box = QComboBox()
if self._ftd_keys:
combo_box.addItems(self._ftd_keys)
if std_type in self._ftd_keys:
combo_box.setCurrentText(std_type)
else:
log.warning(f"Preset '{preset_data.get('preset_name', 'Unknown')}': target_type '{std_type}' not found in FILE_TYPE_DEFINITIONS. Selecting first available.")
if self._ftd_keys: combo_box.setCurrentIndex(0)
else:
log.warning("FILE_TYPE_DEFINITIONS keys not available for ComboBox in map_type_mapping during population.")
combo_box.currentIndexChanged.connect(self._mark_editor_unsaved) # Connect signal
self.editor_table_map_type_mapping.setCellWidget(i, 0, combo_box)
# Column 1: Input Keywords (QTableWidgetItem)
keywords_str = [str(k) for k in keywords if isinstance(k, str)] keywords_str = [str(k) for k in keywords if isinstance(k, str)]
self.editor_table_map_type_mapping.setItem(i, 1, QTableWidgetItem(", ".join(keywords_str))) self.editor_table_map_type_mapping.setItem(i, 1, QTableWidgetItem(", ".join(keywords_str)))
else: else:
log.warning(f"Skipping invalid map_type_mapping item during editor population: {mapping_dict}") log.warning(f"Skipping invalid map_type_mapping item during editor population: {mapping_dict}")
category_rules = preset_data.get("asset_category_rules", {}) category_rules = preset_data.get("asset_category_rules", {})
self.editor_list_model_patterns.clear() self.editor_list_model_patterns.clear()
self.editor_list_model_patterns.addItems(category_rules.get("model_patterns", [])) self.editor_list_model_patterns.addItems(category_rules.get("model_patterns", []))
@@ -543,18 +596,34 @@ class PresetEditorWidget(QWidget):
    for r in range(self.editor_table_bit_depth_variants.rowCount()) if self.editor_table_bit_depth_variants.item(r, 0) and self.editor_table_bit_depth_variants.item(r, 1)}
preset_data["source_naming"] = naming_data
preset_data["move_to_extra_patterns"] = [self.editor_list_extra_patterns.item(i).text() for i in range(self.editor_list_extra_patterns.count())]
map_mappings = []
for r in range(self.editor_table_map_type_mapping.rowCount()):
    target_type_widget = self.editor_table_map_type_mapping.cellWidget(r, 0)
    keywords_item = self.editor_table_map_type_mapping.item(r, 1)

    target_type = ""
    if isinstance(target_type_widget, QComboBox):
        target_type = target_type_widget.currentText()
    elif self.editor_table_map_type_mapping.item(r, 0):  # Fallback if the cell holds an item, not a widget
        target_type_item = self.editor_table_map_type_mapping.item(r, 0)
        if target_type_item:
            target_type = target_type_item.text().strip()

    if target_type and keywords_item and keywords_item.text():
        keywords = [k.strip() for k in keywords_item.text().split(',') if k.strip()]
        if keywords:  # Ensure keywords list is not empty after stripping
            map_mappings.append({"target_type": target_type, "keywords": keywords})
        else:
            log.warning(f"Skipping row {r} in map type mapping table due to empty keywords after processing for target_type '{target_type}'.")
    else:
        # Log if target_type is empty or keywords_item is problematic
        if not target_type:
            log.warning(f"Skipping row {r} in map type mapping table due to empty target_type.")
        if not (keywords_item and keywords_item.text()):
            log.warning(f"Skipping row {r} in map type mapping table for target_type '{target_type}' due to missing or empty keywords item.")
preset_data["map_type_mapping"] = map_mappings
category_rules = {}
category_rules["model_patterns"] = [self.editor_list_model_patterns.item(i).text() for i in range(self.editor_list_model_patterns.count())]
category_rules["decal_keywords"] = [self.editor_list_decal_keywords.item(i).text() for i in range(self.editor_list_decal_keywords.count())]
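The keyword cell is consumed by splitting on commas and dropping empty fragments; a minimal standalone sketch of just that parsing step (the `parse_keywords` name is hypothetical, not part of the widget):

```python
def parse_keywords(cell_text: str) -> list[str]:
    """Split a comma-separated keyword cell, trimming whitespace and dropping empties."""
    return [k.strip() for k in cell_text.split(',') if k.strip()]
```

Note that stray commas and surrounding whitespace are tolerated, which is why the caller only needs to check for an empty result list afterwards.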
@@ -14,16 +14,18 @@ class RuleEditorWidget(QWidget):
    """
    rule_updated = Signal(object)  # Signal emitted when a rule is updated

    def __init__(self, asset_types: list[str] | None = None, file_types: list[str] | None = None, parent=None):
        """
        Initializes the RuleEditorWidget.

        Args:
            asset_types (list[str] | None): A list of available asset type names. Defaults to None.
            file_types (list[str] | None): A list of available file type names (keys from FILE_TYPE_DEFINITIONS). Defaults to None.
            parent: The parent widget.
        """
        super().__init__(parent)
        self.asset_types = asset_types if asset_types else []  # Store asset types
        self.file_types = file_types if file_types else []  # Store file types
        self.current_rule_type = None
        self.current_rule_object = None
@@ -71,12 +73,26 @@ class RuleEditorWidget(QWidget):
        Creates an appropriate editor widget based on the attribute type.
        """
        # --- Special Handling for Asset Type Dropdown ---
        if self.current_rule_type == 'AssetRule' and attr_name in ('asset_type', 'asset_type_override') and self.asset_types:
            widget = QComboBox()
            widget.addItems(self.asset_types)
            # Handle None case for override: if None, don't select anything or select a placeholder
            if attr_value is None and attr_name == 'asset_type_override':
                # Optionally add a placeholder like "<None>" or "<Default>"
                # widget.insertItem(0, "<Default>")  # Example placeholder
                widget.setCurrentIndex(-1)  # No selection or placeholder
            elif attr_value in self.asset_types:
                widget.setCurrentText(attr_value)
            elif self.asset_types:  # Select first item if current value is invalid (and not a None override)
                widget.setCurrentIndex(0)
            return widget
        # --- Special Handling for FileRule item_type and item_type_override ---
        elif self.current_rule_type == 'FileRule' and attr_name in ('item_type', 'item_type_override') and self.file_types:
            widget = QComboBox()
            widget.addItems(self.file_types)
            if attr_value in self.file_types:
                widget.setCurrentText(attr_value)
            elif self.file_types:  # Select first item if current value is invalid
                widget.setCurrentIndex(0)
            return widget
        # --- Standard Type Handling ---
@@ -187,7 +203,8 @@ if __name__ == '__main__':
    # Example usage: Provide asset types during instantiation
    asset_types_from_config = ["Surface", "Model", "Decal", "Atlas", "UtilityMap"]  # Example list
    file_types_from_config = ["MAP_COL", "MAP_NRM", "MAP_METAL", "MAP_ROUGH", "MAP_AO", "MAP_DISP", "MAP_REFL", "MAP_SSS", "MAP_FUZZ", "MAP_IDMAP", "MAP_MASK", "MAP_IMPERFECTION", "MODEL", "EXTRA", "FILE_IGNORE"]  # Example list
    editor = RuleEditorWidget(asset_types=asset_types_from_config, file_types=file_types_from_config)

    # Test loading different rule types
    source_rule = SourceRule()
@@ -166,13 +166,6 @@ def _calculate_image_stats(image_data: np.ndarray) -> dict | None:
        log.error(f"Error calculating image stats: {e}", exc_info=True)  # Log exception info
        return {"error": str(e)}

def _sanitize_filename(name: str) -> str:
    """Removes or replaces characters invalid for filenames/directory names."""
    if not isinstance(name, str): name = str(name)
@@ -253,10 +246,6 @@ class ProcessingEngine:
    provided in a SourceRule object and static configuration.
    It does not perform classification, prediction, or rule fallback internally.
    """

    def __init__(self, config_obj: Configuration):
        """
        Initializes the processing engine with static configuration.
@@ -458,6 +447,76 @@ class ProcessingEngine:
            log.error(f"Failed to remove engine temporary workspace {self.temp_dir}: {e}", exc_info=True)
        self.loaded_data_cache = {}  # Clear cache after cleanup

    def _get_ftd_key_from_override(self, override_string: str) -> Optional[str]:
        """
        Attempts to derive a base FILE_TYPE_DEFINITIONS key from an override string
        which might have a variant suffix (e.g., "MAP_COL-1" -> "MAP_COL").
        """
        if not override_string:  # Handle empty or None override_string
            return None
        if override_string in self.config_obj.FILE_TYPE_DEFINITIONS:
            return override_string
        # Regex to remove a trailing suffix like -<digits>, -<alphanum>, or _<alphanum>
        # e.g., "MAP_COL-1" -> "MAP_COL", "MAP_ROUGH_variantA" -> "MAP_ROUGH"
        base_candidate = re.sub(r"(-[\w\d]+|_[\w\d]+)$", "", override_string)
        if base_candidate in self.config_obj.FILE_TYPE_DEFINITIONS:
            return base_candidate
        # log.debug(f"Could not derive FTD key from override_string '{override_string}'. Tried '{base_candidate}'.")
        return None
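As a standalone sketch of the suffix-stripping behaviour above (the module-level `FILE_TYPE_DEFINITIONS` dict here is a hypothetical stand-in for `self.config_obj.FILE_TYPE_DEFINITIONS`):

```python
import re

# Hypothetical stand-in for the config's FILE_TYPE_DEFINITIONS keys.
FILE_TYPE_DEFINITIONS = {"MAP_COL": {}, "MAP_ROUGH": {}}

def get_ftd_key_from_override(override_string):
    """Return the base FTD key, stripping a trailing variant suffix like '-1' if needed."""
    if not override_string:
        return None
    if override_string in FILE_TYPE_DEFINITIONS:
        return override_string  # Already a known key
    base_candidate = re.sub(r"(-[\w\d]+|_[\w\d]+)$", "", override_string)
    if base_candidate in FILE_TYPE_DEFINITIONS:
        return base_candidate
    return None  # Not resolvable to a known key
```

Note that because `\w` also matches `_`, an underscore suffix may strip more than one segment; the dashed variant form ("MAP_COL-1") is the unambiguous case.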
    def _get_map_variant_suffix(self, map_identifier: str, base_ftd_key: str) -> str:
        """
        Extracts a variant suffix (e.g., "-1", "_variantA") from a map_identifier
        if the base_ftd_key is a prefix of it and the suffix indicates a variant.
        Examples: map_identifier="MAP_COL-1", base_ftd_key="MAP_COL" -> returns "-1"
                  map_identifier="MAP_COL_variant", base_ftd_key="MAP_COL" -> returns "_variant"
                  map_identifier="MAP_COL", base_ftd_key="MAP_COL" -> returns ""
        """
        if not base_ftd_key:  # Ensure base_ftd_key is not empty
            return ""
        if map_identifier.startswith(base_ftd_key):
            suffix = map_identifier[len(base_ftd_key):]
            # Ensure the suffix looks like a variant (starts with - or _) or is empty
            if not suffix or suffix.startswith(('-', '_')):
                return suffix
        return ""  # Default to no suffix
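The prefix-plus-suffix check can be exercised in isolation; a self-contained sketch (the free function name is illustrative):

```python
def get_map_variant_suffix(map_identifier: str, base_ftd_key: str) -> str:
    """Return the variant suffix of map_identifier relative to base_ftd_key, or ''."""
    if not base_ftd_key:
        return ""
    if map_identifier.startswith(base_ftd_key):
        suffix = map_identifier[len(base_ftd_key):]
        # Only accept an empty suffix or one that starts like a variant marker
        if not suffix or suffix.startswith(('-', '_')):
            return suffix
    return ""
```

The guard against suffixes not starting with `-` or `_` matters: it stops "MAP_COLOR" from being treated as a variant of "MAP_COL".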
    def _get_base_map_type(self, map_identifier: str) -> str:
        """
        Gets the base standard type (e.g., "COL") from a map identifier (e.g., "MAP_COL-1", "COL-1"),
        or returns the identifier itself if it is a merged type (e.g., "NRMRGH") or not resolvable
        to a standard type.
        """
        if not map_identifier:  # Handle empty or None map_identifier
            return ""
        # Try to resolve an FTD key: "MAP_COL-1" -> "MAP_COL", "MAP_COL" -> "MAP_COL"
        ftd_key = self._get_ftd_key_from_override(map_identifier)
        if ftd_key:
            definition = self.config_obj.FILE_TYPE_DEFINITIONS.get(ftd_key)
            if definition and definition.get("standard_type"):  # Check that standard_type exists and is not empty
                return definition["standard_type"]  # e.g. returns "COL"
        # If map_identifier was like "COL-1" or "ROUGH" (a standard_type itself, possibly with a suffix),
        # strip the suffix and check whether the base is a known standard_type.
        # Regex captures the leading run of uppercase letters and underscores.
        base_candidate_match = re.match(r"([A-Z_]+)", map_identifier.upper())
        if base_candidate_match:
            potential_std_type = base_candidate_match.group(1)
            for _, definition_val in self.config_obj.FILE_TYPE_DEFINITIONS.items():
                if definition_val.get("standard_type") == potential_std_type:
                    return potential_std_type  # Found e.g. "COL"
        # A merged map type (e.g., "NRMRGH") is neither an FTD key nor a standard_type.
        # Check whether it is one of the output_map_types from MAP_MERGE_RULES.
        for rule in self.config_obj.map_merge_rules:
            if rule.get("output_map_type") == map_identifier:
                return map_identifier  # Return "NRMRGH" as is
        # Fallback: return the original identifier, uppercased.
        log.debug(f"_get_base_map_type: Could not determine standard base for '{map_identifier}'. Returning as is (uppercase).")
        return map_identifier.upper()
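The three-step resolution (FTD key, bare standard_type, merged output type) can be sketched without the engine; the dicts below are hypothetical stand-ins for the config data, and `get_base_map_type` is a simplified free-function version:

```python
import re

FILE_TYPE_DEFINITIONS = {  # hypothetical stand-in for config data
    "MAP_COL": {"standard_type": "COL"},
    "MAP_ROUGH": {"standard_type": "ROUGH"},
}
MAP_MERGE_RULES = [{"output_map_type": "NRMRGH"}]  # hypothetical merge rule

def get_base_map_type(map_identifier: str) -> str:
    if not map_identifier:
        return ""
    # 1) Resolve an FTD key ("MAP_COL-1" -> "MAP_COL") and return its standard_type.
    key = map_identifier if map_identifier in FILE_TYPE_DEFINITIONS \
        else re.sub(r"(-[\w\d]+|_[\w\d]+)$", "", map_identifier)
    if key in FILE_TYPE_DEFINITIONS and FILE_TYPE_DEFINITIONS[key].get("standard_type"):
        return FILE_TYPE_DEFINITIONS[key]["standard_type"]
    # 2) "COL-1" -> leading uppercase run "COL", matched against known standard_types.
    m = re.match(r"([A-Z_]+)", map_identifier.upper())
    if m:
        candidate = m.group(1)
        if any(d.get("standard_type") == candidate for d in FILE_TYPE_DEFINITIONS.values()):
            return candidate
    # 3) Merged output types (e.g. "NRMRGH") pass through unchanged.
    if any(r.get("output_map_type") == map_identifier for r in MAP_MERGE_RULES):
        return map_identifier
    return map_identifier.upper()  # fallback
```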
    def _load_and_transform_source(self, source_path_abs: Path, map_type: str, target_resolution_key: str, is_gloss_source: bool) -> Tuple[Optional[np.ndarray], Optional[np.dtype]]:
        """
@@ -467,7 +526,7 @@ class ProcessingEngine:
        Args:
            source_path_abs: Absolute path to the source file in the workspace.
            map_type: The item_type_override (e.g., "MAP_NRM", "MAP_ROUGH-1").
            target_resolution_key: The key for the target resolution (e.g., "4K").
            is_gloss_source: Boolean indicating if this source should be treated as gloss for inversion.
@@ -485,18 +544,31 @@
            log.debug(f"CACHE HIT: Returning cached data for {source_path_abs.name} at {target_resolution_key}")
            return self.loaded_data_cache[cache_key]  # Return tuple (image_data, source_dtype)

        log.debug(f"CACHE MISS: Loading and transforming {source_path_abs.name} for {target_resolution_key} (map_type: {map_type})")
        img_prepared = None
        source_dtype = None
        try:
            # --- 1. Load Source Image ---
            # Determine read flag based on is_grayscale from FTD
            ftd_key = self._get_ftd_key_from_override(map_type)  # map_type is item_type_override
            is_map_grayscale = False
            standard_type_for_checks = None  # For MASK check
            if ftd_key:
                ftd_definition = self.config_obj.FILE_TYPE_DEFINITIONS.get(ftd_key, {})
                is_map_grayscale = ftd_definition.get("is_grayscale", False)
                standard_type_for_checks = ftd_definition.get("standard_type")
                log.debug(f"For map_type '{map_type}' (FTD key '{ftd_key}'), is_grayscale: {is_map_grayscale}, standard_type: {standard_type_for_checks}")
            else:
                log.warning(f"Could not determine FTD key for map_type '{map_type}' to check is_grayscale. Assuming not grayscale.")

            read_flag = cv2.IMREAD_GRAYSCALE if is_map_grayscale else cv2.IMREAD_UNCHANGED
            # Special case for MASK: always load unchanged first to check alpha
            if standard_type_for_checks == 'MASK':
                log.debug(f"Map type '{map_type}' (standard_type 'MASK') will be loaded with IMREAD_UNCHANGED for alpha check.")
                read_flag = cv2.IMREAD_UNCHANGED

            log.debug(f"Loading source {source_path_abs.name} with flag: {'GRAYSCALE' if read_flag == cv2.IMREAD_GRAYSCALE else 'UNCHANGED'}")
            img_loaded = cv2.imread(str(source_path_abs), read_flag)
@@ -509,18 +581,17 @@ class ProcessingEngine:
            img_prepared = img_loaded  # Start with loaded image

            # MASK Handling (Extract alpha or convert) - Do this BEFORE general color conversions
            if standard_type_for_checks == 'MASK':
                log.debug(f"Processing as MASK type for {source_path_abs.name}.")
                shape = img_prepared.shape
                if len(shape) == 3 and shape[2] == 4:  # BGRA or RGBA (OpenCV loads BGRA)
                    log.debug("MASK processing: Extracting alpha channel (4-channel source).")
                    img_prepared = img_prepared[:, :, 3]  # Extract alpha
                elif len(shape) == 3 and shape[2] == 3:  # BGR or RGB
                    log.debug("MASK processing: Converting 3-channel source to Grayscale.")
                    # If loaded UNCHANGED and 3-channel, assume BGR
                    img_prepared = cv2.cvtColor(img_prepared, cv2.COLOR_BGR2GRAY if read_flag != cv2.IMREAD_GRAYSCALE else cv2.COLOR_RGB2GRAY)
                elif len(shape) == 2:
                    log.debug("MASK processing: Source is already grayscale.")
                else:
                    log.warning(f"MASK processing: Unexpected source shape {shape}. Cannot reliably extract mask.")
                    img_prepared = None  # Cannot process
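The MASK branch above reduces any loaded image to one channel. A numpy-only sketch of the same decision tree, using a luminance formula as a stand-in for the `cv2.cvtColor` call (`extract_mask` is a hypothetical name):

```python
import numpy as np

def extract_mask(img: np.ndarray):
    """Reduce a loaded image to a single-channel mask, mirroring the rules above."""
    if img.ndim == 3 and img.shape[2] == 4:
        return img[:, :, 3]            # 4-channel: take the alpha plane
    if img.ndim == 3 and img.shape[2] == 3:
        # 3-channel: BGR luminance conversion (stand-in for cv2.COLOR_BGR2GRAY)
        b, g, r = img[..., 0], img[..., 1], img[..., 2]
        return np.rint(0.114 * b + 0.587 * g + 0.299 * r).astype(img.dtype)
    if img.ndim == 2:
        return img                     # already grayscale
    return None                        # unexpected shape: cannot process
```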
@@ -528,58 +599,47 @@ class ProcessingEngine:
            # BGR -> RGB conversion (only for 3/4-channel images not loaded as grayscale)
            if len(img_prepared.shape) == 3 and img_prepared.shape[2] >= 3 and read_flag != cv2.IMREAD_GRAYSCALE:
                log.debug(f"Converting loaded image from BGR to RGB for {source_path_abs.name}.")
                if img_prepared.shape[2] == 4:  # BGRA -> RGB
                    img_prepared = cv2.cvtColor(img_prepared, cv2.COLOR_BGRA2RGB)  # OpenCV BGRA to RGB
                else:  # BGR -> RGB
                    img_prepared = cv2.cvtColor(img_prepared, cv2.COLOR_BGR2RGB)
            elif len(img_prepared.shape) == 2:
                log.debug(f"Image {source_path_abs.name} is grayscale or loaded as such, no BGR->RGB conversion needed.")

            if img_prepared is None: raise ProcessingEngineError("Image data is None after MASK/Color prep.")
            # Gloss -> Roughness Inversion
            # map_type is item_type_override, e.g. "MAP_ROUGH-1"; standard_type_for_checks is "ROUGH"
            if standard_type_for_checks == 'ROUGH' and is_gloss_source:
                log.info(f"Performing Gloss->Roughness inversion for {source_path_abs.name} (map_type: {map_type})")
                if len(img_prepared.shape) == 3:
                    log.debug("Gloss Inversion: Converting 3-channel image to grayscale before inversion.")
                    img_prepared = cv2.cvtColor(img_prepared, cv2.COLOR_RGB2GRAY)  # Should be RGB at this point if 3-channel
                stats_before = _calculate_image_stats(img_prepared)
                log.debug(f"Gloss Inversion: Image stats BEFORE inversion: {stats_before}")
                if source_dtype == np.uint16:
                    img_float = 1.0 - (img_prepared.astype(np.float32) / 65535.0)
                elif source_dtype == np.uint8:
                    img_float = 1.0 - (img_prepared.astype(np.float32) / 255.0)
                else:  # Assuming float input is already 0-1 range
                    img_float = 1.0 - img_prepared.astype(np.float32)
                img_prepared = np.clip(img_float, 0.0, 1.0)  # Result is float32
                stats_after = _calculate_image_stats(img_prepared)
                log.debug(f"Gloss Inversion: Image stats AFTER inversion (float32): {stats_after}")
                log.debug(f"Inverted gloss map stored as float32 for ROUGH, original dtype: {source_dtype}")
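The normalize-then-invert step can be illustrated in isolation; a sketch covering the same three dtype cases (`invert_gloss` is an illustrative name, not an engine method):

```python
import numpy as np

def invert_gloss(img: np.ndarray, source_dtype) -> np.ndarray:
    """Normalize by the source dtype's max value, invert (1 - x), clip to [0, 1]."""
    if source_dtype == np.uint16:
        img_float = 1.0 - (img.astype(np.float32) / 65535.0)
    elif source_dtype == np.uint8:
        img_float = 1.0 - (img.astype(np.float32) / 255.0)
    else:  # float input assumed to already be in the 0-1 range
        img_float = 1.0 - img.astype(np.float32)
    return np.clip(img_float, 0.0, 1.0)  # always float32 on output
```

Returning float32 regardless of input dtype is deliberate: the final bit depth is decided later, in `_save_image`.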
            # Ensure data is float32/uint8/uint16 for resizing compatibility
            if isinstance(img_prepared, np.ndarray) and img_prepared.dtype not in [np.uint8, np.uint16, np.float32, np.float16]:
                log.warning(f"Converting unexpected dtype {img_prepared.dtype} to float32 before resizing for {source_path_abs.name}.")
                img_prepared = img_prepared.astype(np.float32)

            # --- 3. Resize ---
            if img_prepared is None: raise ProcessingEngineError(f"Image data is None after initial prep for {source_path_abs.name}.")
            orig_h, orig_w = img_prepared.shape[:2]
            # Get resolutions from static config
            target_dim_px = self.config_obj.image_resolutions.get(target_resolution_key)
@@ -620,108 +680,96 @@ class ProcessingEngine:
            return None, None

    def _save_image(self, image_data: np.ndarray, supplier_name: str, asset_name: str, current_map_identifier: str, resolution_key: str, source_info: dict, output_bit_depth_rule: str) -> Optional[Dict]:
        """
        Handles saving an image NumPy array to a temporary file within the engine's temp_dir using token-based path generation.
        Uses static configuration from self.config_obj for formats, quality, etc.
        The 'maptype' token for the filename is derived based on standard_type and variants.

        Args:
            image_data: NumPy array containing the image data to save.
            supplier_name: The effective supplier name for the asset.
            asset_name: The name of the asset.
            current_map_identifier: The map type being saved (e.g., "MAP_COL", "MAP_ROUGH-1", "NRMRGH"). This is item_type_override or a merged map type.
            resolution_key: The resolution key (e.g., "4K").
            source_info: Dictionary containing details about the source(s).
            output_bit_depth_rule: Rule for determining the output bit depth.

        Returns:
            A dictionary containing details of the saved file or None if saving failed.
        """
        if cv2 is None or np is None:
            log.error("OpenCV or NumPy not available for image saving.")
            return None
        if image_data is None:
            log.error(f"Cannot save image for {current_map_identifier} ({resolution_key}): image_data is None.")
            return None
        if not self.temp_dir or not self.temp_dir.exists():
            log.error(f"Cannot save image for {current_map_identifier} ({resolution_key}): Engine temp_dir is invalid.")
            return None

        try:
            h, w = image_data.shape[:2]
            current_dtype = image_data.dtype
            log.debug(f"Saving {current_map_identifier} ({resolution_key}) for asset '{asset_name}'. Input shape: {image_data.shape}, dtype: {current_dtype}")

            config = self.config_obj
            primary_fmt_16, fallback_fmt_16 = config.get_16bit_output_formats()
            fmt_8bit_config = config.get_8bit_output_format()
            threshold = config.resolution_threshold_for_jpg
            force_lossless_map_types = config.force_lossless_map_types  # Should contain standard_types like "NRM", "DISP"
            jpg_quality = config.jpg_quality
            png_compression_level = config._core_settings.get('PNG_COMPRESSION_LEVEL', 6)
            image_resolutions = config.image_resolutions
            output_directory_pattern = config.get('OUTPUT_DIRECTORY_PATTERN', '[supplier]/[assetname]')
            output_filename_pattern = config.get('OUTPUT_FILENAME_PATTERN', '[assetname]_[maptype]_[resolution].[ext]')

            # --- 1. Determine Output Bit Depth ---
            source_bpc = source_info.get('source_bit_depth', 8)
            max_input_bpc = source_info.get('max_input_bit_depth', source_bpc)
            output_dtype_target, output_bit_depth = np.uint8, 8
            if output_bit_depth_rule == 'force_8bit': output_dtype_target, output_bit_depth = np.uint8, 8
            elif output_bit_depth_rule == 'force_16bit': output_dtype_target, output_bit_depth = np.uint16, 16
            elif output_bit_depth_rule == 'respect':
                if source_bpc == 16: output_dtype_target, output_bit_depth = np.uint16, 16
            elif output_bit_depth_rule == 'respect_inputs':
                if max_input_bpc == 16: output_dtype_target, output_bit_depth = np.uint16, 16
            else:
                log.warning(f"Unknown output_bit_depth_rule '{output_bit_depth_rule}'. Defaulting to 8-bit.")
                output_dtype_target, output_bit_depth = np.uint8, 8
            log.debug(f"Target output bit depth: {output_bit_depth}-bit for {current_map_identifier}")
# --- 2. Determine Output Format ---
output_format, output_ext, save_params, needs_float16 = "", "", [], False
# Use the (potentially suffixed) standard_type for lossless check
base_standard_type_for_lossless_check = self._get_base_map_type(current_map_identifier)  # e.g. "COL", "NRM", "DISP-Detail" -> "DISP"
# Check if the pure standard type (without suffix) is in force_lossless_map_types
pure_standard_type = self._get_ftd_key_from_override(base_standard_type_for_lossless_check)  # Get FTD key if possible
std_type_from_ftd = None
if pure_standard_type and pure_standard_type in self.config_obj.FILE_TYPE_DEFINITIONS:
    std_type_from_ftd = self.config_obj.FILE_TYPE_DEFINITIONS[pure_standard_type].get("standard_type")
# Use std_type_from_ftd if available and non-empty, else fall back to the suffixed check type
check_type_for_lossless = std_type_from_ftd if std_type_from_ftd else base_standard_type_for_lossless_check
force_lossless = check_type_for_lossless in force_lossless_map_types
original_extension = source_info.get('original_extension', '.png')  # Primary source ext
involved_extensions = source_info.get('involved_extensions', {original_extension})  # For merges
target_dim_px = image_resolutions.get(resolution_key, 0)  # Target dimension size
if force_lossless:
    log.debug(f"Format forced to lossless for map type '{current_map_identifier}' (checked as '{check_type_for_lossless}').")
    if output_bit_depth == 16:
        output_format = primary_fmt_16
        if output_format.startswith("exr"):
            output_ext, needs_float16 = ".exr", True
            save_params.extend([cv2.IMWRITE_EXR_TYPE, cv2.IMWRITE_EXR_TYPE_HALF])
        else:  # Ensure PNG for 16-bit lossless when the primary format is not EXR
            output_format = fallback_fmt_16 if fallback_fmt_16 == "png" else "png"
            output_ext = ".png"
            save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, png_compression_level])
    else:  # 8-bit lossless -> PNG
        output_format, output_ext = "png", ".png"
        save_params = [cv2.IMWRITE_PNG_COMPRESSION, png_compression_level]
elif output_bit_depth == 8 and target_dim_px >= threshold:
    output_format, output_ext = 'jpg', '.jpg'
    save_params.extend([cv2.IMWRITE_JPEG_QUALITY, jpg_quality])
    log.debug(f"Using JPG format (Quality: {jpg_quality}) for {current_map_identifier} at {resolution_key} due to resolution threshold ({target_dim_px} >= {threshold}).")
else:
    # Determine highest format involved (for merges) or use original (for individuals)
    highest_format_str = 'jpg'  # Default lowest
    relevant_extensions = involved_extensions  # Keep the alias so the checks below stay defined
    if '.exr' in relevant_extensions: highest_format_str = 'exr'
    elif '.tif' in relevant_extensions: highest_format_str = 'tif'
    elif '.png' in relevant_extensions: highest_format_str = 'png'
@@ -729,44 +777,33 @@ class ProcessingEngine:
if highest_format_str == 'exr':
    if output_bit_depth == 16:
        output_format, output_ext, needs_float16 = "exr", ".exr", True
        save_params.extend([cv2.IMWRITE_EXR_TYPE, cv2.IMWRITE_EXR_TYPE_HALF])
    else:
        output_format, output_ext = "png", ".png"
        save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, png_compression_level])
elif highest_format_str in ('tif', 'png'):  # Treat TIF like PNG for the output choice here
    if output_bit_depth == 16:
        output_format = primary_fmt_16
        if output_format.startswith("exr"):
            output_ext, needs_float16 = ".exr", True
            save_params.extend([cv2.IMWRITE_EXR_TYPE, cv2.IMWRITE_EXR_TYPE_HALF])
        else:
            output_format, output_ext = "png", ".png"
            save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, png_compression_level])
    else:
        output_format, output_ext = "png", ".png"
        save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, png_compression_level])
else:  # Default to configured 8-bit format if highest was JPG or unknown
    output_format = fmt_8bit_config
    output_ext = f".{output_format}"
    if output_format == "png": save_params.extend([cv2.IMWRITE_PNG_COMPRESSION, png_compression_level])
    elif output_format == "jpg": save_params.extend([cv2.IMWRITE_JPEG_QUALITY, jpg_quality])
# Final check: JPG must be 8-bit
if output_format == "jpg" and output_bit_depth == 16:
    log.warning(f"Output format is JPG, but target bit depth is 16. Forcing 8-bit for {current_map_identifier}.")
    output_dtype_target, output_bit_depth = np.uint8, 8
log.debug(f"Determined save format for {current_map_identifier}: {output_format}, ext: {output_ext}, bit_depth: {output_bit_depth}")
# --- 3. Final Data Type Conversion ---
img_to_save = image_data.copy()  # Work on a copy
if output_dtype_target == np.uint8 and img_to_save.dtype != np.uint8:
    if img_to_save.dtype == np.uint16:
        img_to_save = (img_to_save.astype(np.float32) / 65535.0 * 255.0).astype(np.uint8)
    elif img_to_save.dtype in [np.float16, np.float32]:
        img_to_save = (np.clip(img_to_save, 0.0, 1.0) * 255.0).astype(np.uint8)
    else:
        img_to_save = img_to_save.astype(np.uint8)  # Direct cast for other types (e.g., bool)
elif output_dtype_target == np.uint16 and img_to_save.dtype != np.uint16:
    if img_to_save.dtype == np.uint8:
        img_to_save = img_to_save.astype(np.uint16) * 257  # Proper 8->16 bit scaling
    elif img_to_save.dtype in [np.float16, np.float32]:
        img_to_save = (np.clip(img_to_save, 0.0, 1.0) * 65535.0).astype(np.uint16)
    else:
        img_to_save = img_to_save.astype(np.uint16)
if needs_float16 and img_to_save.dtype != np.float16:
    if img_to_save.dtype == np.uint16:
        img_to_save = (img_to_save.astype(np.float32) / 65535.0).astype(np.float16)
    elif img_to_save.dtype == np.uint8:
        img_to_save = (img_to_save.astype(np.float32) / 255.0).astype(np.float16)
    elif img_to_save.dtype == np.float32:
        img_to_save = img_to_save.astype(np.float16)
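The 8-to-16-bit conversion relies on exact scaling: multiplying by 257 maps 255 precisely to 65535, so a white pixel stays white and the round trip back through float32 recovers the original values. A minimal self-contained check of that arithmetic:

```python
import numpy as np

# 8->16 bit: multiply by 257 (65535 / 255), so endpoints map exactly.
a8 = np.array([0, 128, 255], dtype=np.uint8)
a16 = a8.astype(np.uint16) * 257

# 16->8 bit: rescale through float32, as the engine code above does.
back8 = (a16.astype(np.float32) / 65535.0 * 255.0).astype(np.uint8)
```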
@@ -774,47 +811,66 @@ class ProcessingEngine:
# --- 4. Final Color Space Conversion (RGB -> BGR for non-EXR) ---
img_save_final = img_to_save
if len(img_to_save.shape) == 3 and img_to_save.shape[2] == 3 and not output_format.startswith("exr"):
    try:
        img_save_final = cv2.cvtColor(img_to_save, cv2.COLOR_RGB2BGR)
    except Exception as cvt_err:
        log.error(f"RGB->BGR conversion failed for {current_map_identifier}: {cvt_err}. Saving original RGB.")
# --- 5. Determine maptype token for filename ---
filename_map_type_token: str
is_merged_map = any(rule.get("output_map_type") == current_map_identifier for rule in self.config_obj.map_merge_rules)
if is_merged_map:
    filename_map_type_token = current_map_identifier  # e.g., "NRMRGH"
else:
    base_ftd_key = self._get_ftd_key_from_override(current_map_identifier)  # e.g., "MAP_COL"
    if base_ftd_key:
        definition = self.config_obj.FILE_TYPE_DEFINITIONS.get(base_ftd_key)
        if definition and "standard_type" in definition:
            standard_type_alias = definition["standard_type"]  # e.g., "COL"
            if standard_type_alias:  # Ensure not empty
                variant_suffix = self._get_map_variant_suffix(current_map_identifier, base_ftd_key)  # e.g., "-1" or ""
                if standard_type_alias in self.config_obj.respect_variant_map_types:
                    filename_map_type_token = standard_type_alias + variant_suffix  # e.g., "COL-1"
                else:
                    filename_map_type_token = standard_type_alias  # e.g., "COL"
            else:
                log.warning(f"Empty standard_type for FTD key '{base_ftd_key}'. Using identifier '{current_map_identifier}' for maptype token.")
                filename_map_type_token = current_map_identifier
        else:
            log.warning(f"No definition or standard_type for FTD key '{base_ftd_key}'. Using identifier '{current_map_identifier}' for maptype token.")
            filename_map_type_token = current_map_identifier
    else:
        log.warning(f"Could not derive FTD key from '{current_map_identifier}'. Using it directly for maptype token.")
        filename_map_type_token = current_map_identifier
log.debug(f"Filename maptype token for '{current_map_identifier}' is '{filename_map_type_token}'")
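The fallback chain for the maptype token can be sketched in isolation. Everything here is illustrative: the stand-in `FILE_TYPE_DEFINITIONS` dict, the `respect_variant_map_types` set, and the `filename_token` helper are hypothetical stand-ins for the engine's config object and methods, not its actual API.

```python
from typing import Optional

# Stand-in config data; the real FILE_TYPE_DEFINITIONS live in app_settings.json.
FILE_TYPE_DEFINITIONS = {"MAP_COL": {"standard_type": "COL"}}
respect_variant_map_types = {"COL"}

def filename_token(identifier: str, ftd_key: Optional[str], variant_suffix: str) -> str:
    """Mirror the fallback chain: alias (+ optional variant suffix), else raw identifier."""
    definition = FILE_TYPE_DEFINITIONS.get(ftd_key) if ftd_key else None
    alias = definition.get("standard_type") if definition else None
    if not alias:
        return identifier  # no FTD key, no definition, or empty standard_type
    if alias in respect_variant_map_types:
        return alias + variant_suffix  # e.g. "COL" + "-1" -> "COL-1"
    return alias
```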
# --- 6. Construct Path using Token Pattern & Save ---
token_data = {
    "supplier": _sanitize_filename(supplier_name),
    "assetname": _sanitize_filename(asset_name),
    "maptype": filename_map_type_token,  # Use the derived token
    "resolution": resolution_key,
    "width": w,
    "height": h,
    "bitdepth": output_bit_depth,
    "ext": output_ext.lstrip('.')
}
# Add optional token data if available
if hasattr(self, 'current_incrementing_value') and self.current_incrementing_value is not None:
    token_data['incrementingvalue'] = self.current_incrementing_value
if hasattr(self, 'current_sha5_value') and self.current_sha5_value is not None:
    token_data['sha5'] = self.current_sha5_value
try:
    # Generate directory and filename separately, then combine
    relative_dir_path_str = generate_path_from_pattern(output_directory_pattern, token_data)
    filename_str = generate_path_from_pattern(output_filename_pattern, token_data)
    full_relative_path_str = str(Path(relative_dir_path_str) / filename_str)
except Exception as path_gen_err:
    log.error(f"Failed to generate output path for {current_map_identifier} with data {token_data}: {path_gen_err}", exc_info=True)
    return None  # Cannot proceed without a path
output_path_temp = self.temp_dir / full_relative_path_str  # Save to engine's temp dir, preserving structure
log.debug(f"Attempting to save {current_map_identifier} to temporary path: {output_path_temp}")
try:
    output_path_temp.parent.mkdir(parents=True, exist_ok=True)
except Exception as mkdir_err:
@@ -959,14 +1015,14 @@ class ProcessingEngine:
    file_rule.item_type_override is None or
    file_rule.item_type_override == "EXTRA" or  # Explicitly skip "EXTRA" type
    getattr(file_rule, 'skip_processing', False) or
    file_rule.item_type == "FILE_IGNORE"  # Consolidated check: use item_type for base classification
)
if should_skip:
    skip_reason = []
    if file_rule.item_type_override is None: skip_reason.append("No ItemTypeOverride")
    if file_rule.item_type_override == "EXTRA": skip_reason.append("Explicitly EXTRA type")
    if getattr(file_rule, 'skip_processing', False): skip_reason.append("SkipProcessing flag set")
    if file_rule.item_type == "FILE_IGNORE": skip_reason.append("ItemType is FILE_IGNORE")
    log.debug(f"Skipping individual processing for {file_rule.file_path} ({', '.join(skip_reason)})")
    continue  # Skip to the next file_rule
@@ -1056,13 +1112,12 @@ class ProcessingEngine:
save_result = self._save_image(
    image_data=img_resized,
    supplier_name=supplier_name,
    asset_name=base_name,
    current_map_identifier=save_map_type,  # Pass the map type to be saved
    resolution_key=res_key,
    source_info=source_info,
    output_bit_depth_rule=bit_depth_rule
)
# --- 5. Store Result ---
@@ -1323,16 +1378,15 @@ class ProcessingEngine:
supplier_name = current_asset_metadata.get("supplier_name", "UnknownSupplier")
save_result = self._save_image(
    image_data=merged_image_float32,  # Pass the merged float32 data
    supplier_name=supplier_name,
    asset_name=base_name,
    current_map_identifier=output_map_type,  # Merged map type
    resolution_key=current_res_key,
    source_info=source_info_for_save,  # Pass collected source info
    output_bit_depth_rule=rule_bit_depth  # Pass the rule's requirement
)

# --- Record details locally ---
if save_result:
    merged_maps_details_asset[output_map_type][current_res_key] = save_result
@@ -1376,6 +1430,7 @@ class ProcessingEngine:
# Start with the base metadata passed in (already contains name, category, archetype, stats, aspect, map_details)
final_metadata = current_asset_metadata.copy()
final_metadata["category"] = asset_rule.asset_type  # Ensure the standardized asset type is in the metadata
# Use the effective supplier passed as argument
final_metadata["supplier_name"] = effective_supplier  # Already determined in process()
@@ -1398,19 +1453,21 @@ class ProcessingEngine:
# Determine shader features based on this asset's maps and rules
features = set()
map_details_asset = final_metadata.get("map_details", {})  # Get from metadata dict
for map_type, details in map_details_asset.items():  # map_type here is an item_type_override like "MAP_COL-1"
    base_standard_type = self._get_base_map_type(map_type)  # Should give e.g. "COL"
    # Check standard feature types
    if base_standard_type in ["SSS", "FUZZ", "MASK", "TRANSMISSION", "EMISSION", "CLEARCOAT"]:
        features.add(base_standard_type)
    if details.get("derived_from_gloss"): features.add("InvertedGloss")
    # Check if any resolution was saved as 16-bit
    res_details = processed_maps_details_asset.get(map_type, {})
    if any(res_info.get("bit_depth") == 16 for res_info in res_details.values() if isinstance(res_info, dict)):
        features.add(f"16bit_{base_standard_type}")
# Check merged maps for 16-bit output
for map_type, res_dict in merged_maps_details_asset.items():  # map_type here is e.g. "NRMRGH"
    base_standard_type = self._get_base_map_type(map_type)  # Should give "NRMRGH"
    if any(res_info.get("bit_depth") == 16 for res_info in res_dict.values() if isinstance(res_info, dict)):
        features.add(f"16bit_{base_standard_type}")
final_metadata["shader_features"] = sorted(list(features))
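The feature-collection loop above can be sketched as a standalone function. This is an illustration only: `collect_features` is a hypothetical name, and the suffix-stripping stand-in for `_get_base_map_type` is a crude approximation of the engine's real helper.

```python
FEATURE_TYPES = {"SSS", "FUZZ", "MASK", "TRANSMISSION", "EMISSION", "CLEARCOAT"}

def collect_features(map_details: dict, processed: dict) -> list:
    """Collect shader feature flags from per-map details and saved-resolution info."""
    features = set()
    for map_type, details in map_details.items():
        # Crude stand-in for _get_base_map_type: drop "MAP_" prefix and variant suffix.
        base = map_type.split("-")[0].removeprefix("MAP_")
        if base in FEATURE_TYPES:
            features.add(base)
        if details.get("derived_from_gloss"):
            features.add("InvertedGloss")
        res = processed.get(map_type, {})
        if any(r.get("bit_depth") == 16 for r in res.values() if isinstance(r, dict)):
            features.add(f"16bit_{base}")
    return sorted(features)
```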