16 bit processing fixes + code unification

parent 415a3d64e8
commit d9baa5c454
@@ -39,7 +39,8 @@
     "get_conport_schema",
     "get_recent_activity_summary",
     "semantic_search_conport",
-    "search_system_patterns_fts"
+    "search_system_patterns_fts",
+    "update_decision"
   ]
 }
ProjectNotes/bit_depth_refactor_plan.md  (new file, 105 lines)
@@ -0,0 +1,105 @@

# Bit Depth Terminology Refactoring Plan

## 1. Background

Currently, there is an inconsistency in how bit depth rules and settings are defined and used across the project:

* **`config/file_type_definitions.json`**: uses `"bit_depth_rule"` with values such as `"force_8bit"` and `"respect"`.
* **`config/app_settings.json`**: (within `MAP_MERGE_RULES`) uses `"output_bit_depth"` with values such as `"respect_inputs"`.
* **`processing/utils/image_saving_utils.py`**: contains logic that attempts to handle `"respect_inputs"` but is currently unreachable, and the `"respect"` rule effectively defaults to 8-bit.

This plan unifies the terminology and corrects the processing logic.

## 2. Proposed Unified Terminology

A new configuration key and a clear set of values will be adopted:

* **New key**: `bit_depth_policy`
  * Replaces `"bit_depth_rule"` in `file_type_definitions.json`.
  * Replaces `"output_bit_depth"` in `app_settings.json` (for `MAP_MERGE_RULES`).
* **Values for `bit_depth_policy`**:
  * `"force_8bit"`: always output 8-bit.
  * `"force_16bit"`: always output 16-bit.
  * `"preserve"`: if any source image (or any input to a merge operation) has a bit depth greater than 8-bit, the output will be 16-bit; otherwise, the output will be 8-bit.
  * `""` (empty string or `null`): no specific bit depth policy applies (e.g., for non-image files such as models or text files).
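A minimal sketch of how the unified key would look in both files once renamed (entries abbreviated and hypothetical; real definitions also carry `color`, `description`, `examples`, etc.):

```python
# Abbreviated, hypothetical excerpts showing the single unified key.
file_type_definitions = {
    "MAP_NRM": {"bit_depth_policy": "preserve"},    # was "bit_depth_rule": "respect"
    "MAP_COL": {"bit_depth_policy": "force_8bit"},  # key renamed, value unchanged
    "MODEL":   {"bit_depth_policy": ""},            # non-image file: no policy
}
map_merge_rules = [
    {"bit_depth_policy": "preserve"},  # was "output_bit_depth": "respect_inputs"
]
```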

## 3. Refactoring Plan Details

### Phase 1: Configuration File Updates

1. **`config/file_type_definitions.json`**:
   * Rename all instances of the key `"bit_depth_rule"` to `"bit_depth_policy"`.
   * Update values:
     * `"force_8bit"` remains `"force_8bit"`.
     * `"respect"` becomes `"preserve"`.
     * `""` (empty string) remains `""`.
2. **`config/app_settings.json`**:
   * Within each rule in the `MAP_MERGE_RULES` array, rename the key `"output_bit_depth"` to `"bit_depth_policy"`.
   * Update the value: `"respect_inputs"` becomes `"preserve"`.

### Phase 2: Code Update - `configuration.py`

1. Modify the `Configuration` class:
   * Rename the method `get_bit_depth_rule()` to `get_bit_depth_policy()`.
   * Update this method to read the new `"bit_depth_policy"` key from the loaded file type definitions.
   * Ensure it correctly handles and returns the new policy values (`"force_8bit"`, `"force_16bit"`, `"preserve"`, `""`).
   * The method should continue to provide a sensible default if a map type is not found or has an invalid policy.
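The accessor described above can be sketched as follows (a simplified stand-in, not the real `Configuration` class; the actual method also strips variant suffixes like `"MAP_COL-1"` and logs warnings):

```python
# Valid values for the unified policy; the empty string is a valid "no policy".
VALID_POLICIES = {"force_8bit", "force_16bit", "preserve", ""}

class ConfigurationSketch:
    """Hypothetical minimal stand-in for the real Configuration class."""

    def __init__(self, file_type_definitions: dict):
        self._ftd = file_type_definitions

    def get_bit_depth_policy(self, map_type: str) -> str:
        # Look up the definition; fall back to "preserve" for unknown
        # map types or invalid policy values.
        definition = self._ftd.get(map_type, {})
        policy = definition.get("bit_depth_policy")
        return policy if policy in VALID_POLICIES else "preserve"
```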

### Phase 3: Code Update - `processing/utils/image_saving_utils.py`

1. Refactor the `save_image_variants` function:
   * It will receive the `bit_depth_policy` (e.g., `"preserve"`, `"force_8bit"`) via its `file_type_defs` argument (which originates from the `Configuration` object).
   * Correct the internal logic for determining `target_bit_depth` from the policy:
     * If `bit_depth_policy == "force_8bit"`, then `target_bit_depth = 8`.
     * If `bit_depth_policy == "force_16bit"`, then `target_bit_depth = 16`.
     * If `bit_depth_policy == "preserve"`:
       * Examine the `source_bit_depth_info` argument (the list of bit depths of the input images).
       * If any source bit depth in `source_bit_depth_info` is greater than 8, then `target_bit_depth = 16`.
       * Otherwise (all source bit depths are 8 or less, or the list is empty or all `None`), `target_bit_depth = 8`.
   * If `bit_depth_policy` is `""`, `null`, or any other unhandled value, establish a clear default (e.g., log a warning and fall back to `"preserve"`, or skip bit depth adjustment where appropriate for the file type).
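The decision logic above can be sketched as a standalone helper (a hypothetical function, not the actual `save_image_variants` signature; the fallback for unknown policies follows the suggested `"preserve"` default):

```python
def resolve_target_bit_depth(bit_depth_policy, source_bit_depth_info):
    """Return 8 or 16 for the output, per the unified bit depth policy."""
    if bit_depth_policy == "force_8bit":
        return 8
    if bit_depth_policy == "force_16bit":
        return 16
    if bit_depth_policy == "preserve":
        # 16-bit only if any known source depth exceeds 8 bits;
        # an empty list or all-None entries fall back to 8-bit.
        depths = [d for d in (source_bit_depth_info or []) if d is not None]
        return 16 if any(d > 8 for d in depths) else 8
    # "" / None / unknown value: fall back to "preserve" semantics
    return resolve_target_bit_depth("preserve", source_bit_depth_info)
```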

### Phase 4: Code Update - `processing/pipeline/stages/merged_task_processor.py`

1. This stage is largely unaffected in its core logic for collecting `input_source_bit_depths`.
2. The `ProcessedMergedMapData` object it produces will continue to carry these `source_bit_depths`.
3. When this data is later passed to the `SaveVariantsStage` (and subsequently to `save_image_variants`), the `internal_map_type` of the merged map (e.g., "MAP_NRMRGH") will be used. The `Configuration` object will provide its `bit_depth_policy` (which, after refactoring `file_type_definitions.json`, should be `"preserve"` for the relevant merged maps).
4. The refactored `save_image_variants` will then use this `"preserve"` policy together with the `source_bit_depth_info` (derived from the merge inputs) to correctly determine the output bit depth for the merged map.

### Phase 5: Review Other Code & Potential Impacts

1. Search the codebase for any remaining direct usages of the old keys (`"bit_depth_rule"`, `"output_bit_depth"`) or their values.
2. Update these locations to use the new `Configuration.get_bit_depth_policy()` method and the new `"bit_depth_policy"` key and values.
3. Pay special attention to any prediction logic (e.g., in `gui/prediction_handler.py` or `gui/llm_prediction_handler.py`) that currently considers or tries to infer bit depth rules.

## 4. Backward Compatibility & Migration

* This is a breaking change for existing user-customized configuration files (`file_type_definitions.json`, `app_settings.json`, and any custom presets).
* **Recommended approach**: implement migration logic within the `Configuration` class's loading methods.
  * When loading `file_type_definitions.json`: if `"bit_depth_rule"` is found, convert its value (e.g., `"respect"` to `"preserve"`), store it under the new `"bit_depth_policy"` key, and log a warning.
  * When loading `app_settings.json` (specifically `MAP_MERGE_RULES`): if `"output_bit_depth"` is found, convert its value (e.g., `"respect_inputs"` to `"preserve"`), store it under `"bit_depth_policy"`, and log a warning.
* This keeps the application working with older user configs while guiding users to update.
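The key/value conversion described above can be sketched as one generic helper applied to each loaded dict (a hypothetical helper; the real migration also emits log warnings and runs inside the `Configuration` loaders):

```python
# Old values that change meaning under the unified terminology.
_VALUE_MIGRATIONS = {"respect": "preserve", "respect_inputs": "preserve"}

def migrate_bit_depth_keys(definition: dict, old_key: str = "bit_depth_rule") -> dict:
    """Rewrite one config dict in place: old key/value -> "bit_depth_policy"."""
    if old_key in definition:
        old_value = definition.pop(old_key)  # drop the legacy key
        # "force_8bit", "force_16bit", and "" pass through unchanged.
        definition["bit_depth_policy"] = _VALUE_MIGRATIONS.get(old_value, old_value)
    return definition
```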

## 5. Visualized Logic for `save_image_variants` (Post-Refactor)

```mermaid
graph TD
    A[Start save_image_variants] --> B{Get bit_depth_policy for base_map_type};
    B --> C{Policy is "force_8bit"?};
    C -- Yes --> D[target_bit_depth = 8];
    C -- No --> E{Policy is "force_16bit"?};
    E -- Yes --> F[target_bit_depth = 16];
    E -- No --> G{Policy is "preserve"?};
    G -- Yes --> H{Any source_bit_depth_info > 8?};
    H -- Yes --> I[target_bit_depth = 16];
    H -- No --> J[target_bit_depth = 8];
    G -- No --> K[Log warning: unknown or "" policy, default to 8-bit or handle as per type];
    K --> D;
    D --> L[Proceed to save with 8-bit];
    F --> M[Proceed to save with 16-bit];
    I --> M;
    J --> L;
    L --> Z[End];
    M --> Z;
```

This plan aims to create a more consistent, understandable, and correctly functioning system for handling bit depth across the application.
@@ -1,7 +1,7 @@
 {
   "FILE_TYPE_DEFINITIONS": {
     "MAP_COL": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#ffaa00",
       "description": "Color/Albedo Map",
       "examples": [
@@ -15,7 +15,7 @@
       "standard_type": "COL"
     },
     "MAP_NRM": {
-      "bit_depth_rule": "respect",
+      "bit_depth_policy": "preserve",
       "color": "#cca2f1",
       "description": "Normal Map",
       "examples": [
@@ -26,8 +26,17 @@
       "keybind": "N",
       "standard_type": "NRM"
     },
+    "MAP_NRMRGH": {
+      "bit_depth_policy": "preserve",
+      "color": "#abcdef",
+      "description": "Normal + Roughness Merged Map",
+      "examples": [],
+      "is_grayscale": false,
+      "keybind": "",
+      "standard_type": "NRMRGH"
+    },
     "MAP_METAL": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#dcf4f2",
       "description": "Metalness Map",
       "examples": [
@@ -39,7 +48,7 @@
       "standard_type": "METAL"
     },
     "MAP_ROUGH": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#bfd6bf",
       "description": "Roughness Map",
       "examples": [
@@ -52,7 +61,7 @@
       "standard_type": "ROUGH"
     },
     "MAP_GLOSS": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#d6bfd6",
       "description": "Glossiness Map",
       "examples": [
@@ -64,7 +73,7 @@
       "standard_type": "GLOSS"
     },
     "MAP_AO": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#e3c7c7",
       "description": "Ambient Occlusion Map",
       "examples": [
@@ -76,7 +85,7 @@
       "standard_type": "AO"
     },
     "MAP_DISP": {
-      "bit_depth_rule": "respect",
+      "bit_depth_policy": "preserve",
       "color": "#c6ddd5",
       "description": "Displacement/Height Map",
       "examples": [
@@ -88,7 +97,7 @@
       "standard_type": "DISP"
     },
     "MAP_REFL": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#c2c2b9",
       "description": "Reflection/Specular Map",
       "examples": [
@@ -100,7 +109,7 @@
       "standard_type": "REFL"
     },
     "MAP_SSS": {
-      "bit_depth_rule": "respect",
+      "bit_depth_policy": "preserve",
       "color": "#a0d394",
       "description": "Subsurface Scattering Map",
       "examples": [
@@ -112,7 +121,7 @@
       "standard_type": "SSS"
     },
     "MAP_FUZZ": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#a2d1da",
       "description": "Fuzz/Sheen Map",
       "examples": [
@@ -124,7 +133,7 @@
       "standard_type": "FUZZ"
     },
     "MAP_IDMAP": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#ca8fb4",
       "description": "ID Map (for masking)",
       "examples": [
@@ -136,7 +145,7 @@
       "standard_type": "IDMAP"
     },
     "MAP_MASK": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#c6e2bf",
       "description": "Generic Mask Map",
       "examples": [
@@ -147,7 +156,7 @@
       "standard_type": "MASK"
     },
     "MAP_IMPERFECTION": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#e6d1a6",
       "description": "Imperfection Map (scratches, dust)",
       "examples": [
@@ -164,7 +173,7 @@
       "standard_type": "IMPERFECTION"
     },
     "MODEL": {
-      "bit_depth_rule": "",
+      "bit_depth_policy": "",
       "color": "#3db2bd",
       "description": "3D Model File",
       "examples": [
@@ -176,7 +185,7 @@
       "standard_type": ""
     },
     "EXTRA": {
-      "bit_depth_rule": "",
+      "bit_depth_policy": "",
       "color": "#8c8c8c",
       "description": "asset previews or metadata",
       "examples": [
@@ -193,7 +202,7 @@
       "standard_type": "EXTRA"
     },
     "FILE_IGNORE": {
-      "bit_depth_rule": "",
+      "bit_depth_policy": "",
       "color": "#673d35",
       "description": "File identified to be ignored due to prioritization rules (e.g., a lower bit-depth version when a higher one is present).",
       "category": "Ignored",
@@ -37,7 +37,7 @@
         "G": 0.5,
         "B": 0.5
       },
-      "output_bit_depth": "respect_inputs"
+      "bit_depth_policy": "preserve"
     }
   ],
   "CALCULATE_STATS_RESOLUTION": "1K",
@@ -1,7 +1,7 @@
 {
   "FILE_TYPE_DEFINITIONS": {
     "MAP_COL": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#ffaa00",
       "description": "Color/Albedo Map",
       "examples": [
@@ -15,7 +15,7 @@
       "standard_type": "COL"
     },
     "MAP_NRM": {
-      "bit_depth_rule": "respect",
+      "bit_depth_policy": "preserve",
       "color": "#cca2f1",
       "description": "Normal Map",
       "examples": [
@@ -27,7 +27,7 @@
       "standard_type": "NRM"
     },
     "MAP_METAL": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#dcf4f2",
       "description": "Metalness Map",
       "examples": [
@@ -39,7 +39,7 @@
       "standard_type": "METAL"
     },
     "MAP_ROUGH": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#bfd6bf",
       "description": "Roughness Map",
       "examples": [
@@ -52,7 +52,7 @@
       "standard_type": "ROUGH"
     },
     "MAP_GLOSS": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#d6bfd6",
       "description": "Glossiness Map",
       "examples": [
@@ -64,7 +64,7 @@
       "standard_type": "GLOSS"
     },
     "MAP_AO": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#e3c7c7",
       "description": "Ambient Occlusion Map",
       "examples": [
@@ -76,7 +76,7 @@
       "standard_type": "AO"
     },
     "MAP_DISP": {
-      "bit_depth_rule": "respect",
+      "bit_depth_policy": "preserve",
       "color": "#c6ddd5",
       "description": "Displacement/Height Map",
       "examples": [
@@ -88,7 +88,7 @@
       "standard_type": "DISP"
     },
     "MAP_REFL": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#c2c2b9",
       "description": "Reflection/Specular Map",
       "examples": [
@@ -100,7 +100,7 @@
       "standard_type": "REFL"
     },
     "MAP_SSS": {
-      "bit_depth_rule": "respect",
+      "bit_depth_policy": "preserve",
       "color": "#a0d394",
       "description": "Subsurface Scattering Map",
       "examples": [
@@ -112,7 +112,7 @@
       "standard_type": "SSS"
     },
     "MAP_FUZZ": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#a2d1da",
       "description": "Fuzz/Sheen Map",
       "examples": [
@@ -124,7 +124,7 @@
       "standard_type": "FUZZ"
     },
     "MAP_IDMAP": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#ca8fb4",
       "description": "ID Map (for masking)",
       "examples": [
@@ -136,7 +136,7 @@
       "standard_type": "IDMAP"
     },
     "MAP_MASK": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#c6e2bf",
       "description": "Generic Mask Map",
       "examples": [
@@ -146,8 +146,19 @@
       "keybind": "",
       "standard_type": "MASK"
     },
+    "MAP_NRMRGH": {
+      "bit_depth_policy": "preserve",
+      "color": "#abcdef",
+      "description": "Packed Normal + Roughness + Metallic Map",
+      "examples": [
+        "_nrmrgh."
+      ],
+      "is_grayscale": false,
+      "keybind": "",
+      "standard_type": "NRMRGH"
+    },
     "MAP_IMPERFECTION": {
-      "bit_depth_rule": "force_8bit",
+      "bit_depth_policy": "force_8bit",
       "color": "#e6d1a6",
       "description": "Imperfection Map (scratches, dust)",
       "examples": [
@@ -164,7 +175,7 @@
       "standard_type": "IMPERFECTION"
     },
     "MODEL": {
-      "bit_depth_rule": "",
+      "bit_depth_policy": "",
       "color": "#3db2bd",
       "description": "3D Model File",
       "examples": [
@@ -176,7 +187,7 @@
       "standard_type": ""
     },
     "EXTRA": {
-      "bit_depth_rule": "",
+      "bit_depth_policy": "",
       "color": "#8c8c8c",
       "description": "asset previews or metadata",
       "examples": [
@@ -193,7 +204,7 @@
       "standard_type": "EXTRA"
     },
     "FILE_IGNORE": {
-      "bit_depth_rule": "",
+      "bit_depth_policy": "",
       "color": "#673d35",
       "description": "File identified to be ignored due to prioritization rules (e.g., a lower bit-depth version when a higher one is present).",
       "category": "Ignored",
@@ -177,6 +177,40 @@ class Configuration:
         self._file_type_definitions: dict = self._load_definition_file_with_fallback(
             self.FILE_TYPE_DEFINITIONS_FILENAME, "FILE_TYPE_DEFINITIONS"
         )
+
+        # --- Migration Logic for file_type_definitions.json ---
+        # Moved from _load_definition_file_with_fallback to ensure execution
+        if isinstance(self._file_type_definitions, dict):
+            log.debug(f"Applying migration logic for old bit depth terminology in {self.FILE_TYPE_DEFINITIONS_FILENAME}")
+            for map_type_key, definition in self._file_type_definitions.items():
+                if isinstance(definition, dict):
+                    # Check for old key "bit_depth_rule"
+                    if "bit_depth_rule" in definition:
+                        old_rule = definition.pop("bit_depth_rule")  # Remove old key
+                        new_policy = old_rule  # Start with the old value
+                        if old_rule == "respect":
+                            new_policy = "preserve"  # Map old value to new
+                        elif old_rule == "respect_inputs":
+                            new_policy = "preserve"  # Map old value to new (though this shouldn't be in FTD)
+                        elif old_rule == "":
+                            new_policy = ""  # Keep empty string
+                        # "force_8bit" and "force_16bit" values remain the same
+
+                        definition["bit_depth_policy"] = new_policy  # Add new key with migrated value
+                        log.warning(f"Migrated old 'bit_depth_rule': '{old_rule}' to 'bit_depth_policy': '{new_policy}' for map type '{map_type_key}' in {self.FILE_TYPE_DEFINITIONS_FILENAME}. Please update your configuration file.")
+
+                    # Also check for old value "respect" under the new key, in case the key was renamed manually but the value was not
+                    if "bit_depth_policy" in definition and definition["bit_depth_policy"] == "respect":
+                        definition["bit_depth_policy"] = "preserve"
+                        log.warning(f"Migrated old 'bit_depth_policy' value 'respect' to 'preserve' for map type '{map_type_key}' in {self.FILE_TYPE_DEFINITIONS_FILENAME}. Please update your configuration file.")
+
+        # --- Migration Logic for app_settings.json (MAP_MERGE_RULES) ---
+        # This needs to happen after core settings are loaded and potentially merged with user settings,
+        # so it might be better placed in __init__ after the merge, or in a dedicated method called by __init__.
+        # For now, focus on the file_type_definitions.json issue causing the autotest warnings.
+        # The app_settings.json migration can be a separate step if needed, but the primary issue
+        # seems to be with file_type_definitions.json loading in the test context.
+
         self._llm_settings: dict = self._load_definition_file_with_fallback(
             self.LLM_SETTINGS_FILENAME, None  # LLM settings might be flat (no root key)
         )
@@ -325,6 +359,8 @@ class Configuration:
                 return content
         return data  # For flat files
+
+
 
     def _load_preset_with_fallback(self, preset_name_stem: str) -> dict:
         """
         Loads a preset JSON file from the user's Presets subdir.
@@ -513,8 +549,6 @@ class Configuration:
                     if not isinstance(prio_keyword, str):
                         raise ConfigurationError(f"Preset file '{self._preset_filename_stem}.json': Priority keyword at index {prio_kw_index} in rule {index} ('{rule['target_type']}') must be a string.")

-        if not isinstance(self._core_settings.get('TARGET_FILENAME_PATTERN'), str):
-            raise ConfigurationError("Core config 'TARGET_FILENAME_PATTERN' must be a string.")
         if not isinstance(self._core_settings.get('OUTPUT_DIRECTORY_PATTERN'), str):
             raise ConfigurationError("Core config 'OUTPUT_DIRECTORY_PATTERN' must be a string.")
         if not isinstance(self._core_settings.get('OUTPUT_FILENAME_PATTERN'), str):
@ -691,26 +725,27 @@ class Configuration:
|
|||||||
"""Gets the list of map types that must always be saved losslessly."""
|
"""Gets the list of map types that must always be saved losslessly."""
|
||||||
return self._core_settings.get('FORCE_LOSSLESS_MAP_TYPES', [])
|
return self._core_settings.get('FORCE_LOSSLESS_MAP_TYPES', [])
|
||||||
|
|
||||||
def get_bit_depth_rule(self, map_type_input: str) -> str:
|
def get_bit_depth_policy(self, map_type_input: str) -> str:
|
||||||
"""
|
"""
|
||||||
Gets the bit depth rule ('respect', 'force_8bit', 'force_16bit') for a given map type identifier.
|
Gets the bit depth policy ('force_8bit', 'force_16bit', 'preserve', '') for a given map type identifier.
|
||||||
The map_type_input can be an FTD key (e.g., "MAP_COL") or a suffixed FTD key (e.g., "MAP_COL-1").
|
The map_type_input can be an FTD key (e.g., "MAP_COL") or a suffixed FTD key (e.g., "MAP_COL-1").
|
||||||
"""
|
"""
|
||||||
if not self._file_type_definitions: # Check if the attribute exists and is not empty
|
if not self._file_type_definitions: # Check if the attribute exists and is not empty
|
||||||
log.warning("File type definitions not loaded. Cannot determine bit depth rule.")
|
log.warning("File type definitions not loaded. Cannot determine bit depth policy.")
|
||||||
return "respect"
|
return "preserve" # Defaulting to 'preserve' as per refactor plan Phase 1 completion
|
||||||
|
|
||||||
file_type_definitions = self._file_type_definitions
|
file_type_definitions = self._file_type_definitions
|
||||||
|
|
||||||
# 1. Try direct match with map_type_input as FTD key
|
# 1. Try direct match with map_type_input as FTD key
|
||||||
definition = file_type_definitions.get(map_type_input)
|
definition = file_type_definitions.get(map_type_input)
|
||||||
if definition:
|
if definition:
|
||||||
rule = definition.get('bit_depth_rule')
|
policy = definition.get('bit_depth_policy')
|
||||||
if rule in ['respect', 'force_8bit', 'force_16bit']:
|
# Valid policies include the empty string
|
||||||
return rule
|
if policy in ['force_8bit', 'force_16bit', 'preserve', '']:
|
||||||
|
return policy
|
||||||
else:
|
else:
|
||||||
log.warning(f"FTD key '{map_type_input}' found, but 'bit_depth_rule' is missing or invalid: '{rule}'. Defaulting to 'respect'.")
|
log.warning(f"FTD key '{map_type_input}' found, but 'bit_depth_policy' is missing or invalid: '{policy}'. Defaulting to 'preserve'.")
|
||||||
return "respect"
|
return "preserve"
|
||||||
|
|
||||||
# 2. Try to derive base FTD key by stripping common variant suffixes
|
# 2. Try to derive base FTD key by stripping common variant suffixes
|
||||||
# Regex to remove trailing suffixes like -<digits>, -<alphanum>, _<alphanum>
|
 # Regex to remove trailing suffixes like -<digits>, -<alphanum>, _<alphanum>
@@ -718,17 +753,17 @@ class Configuration:
         if base_ftd_key_candidate != map_type_input:
             definition = file_type_definitions.get(base_ftd_key_candidate)
             if definition:
-                rule = definition.get('bit_depth_rule')
-                if rule in ['respect', 'force_8bit', 'force_16bit']:
-                    log.debug(f"Derived base FTD key '{base_ftd_key_candidate}' from '{map_type_input}' and found bit depth rule: {rule}")
-                    return rule
+                policy = definition.get('bit_depth_policy')
+                if policy in ['force_8bit', 'force_16bit', 'preserve', '']:
+                    log.debug(f"Derived base FTD key '{base_ftd_key_candidate}' from '{map_type_input}' and found bit depth policy: {policy}")
+                    return policy
                 else:
-                    log.warning(f"Derived base FTD key '{base_ftd_key_candidate}' from '{map_type_input}', but 'bit_depth_rule' is missing/invalid: '{rule}'. Defaulting to 'respect'.")
-                    return "respect"
+                    log.warning(f"Derived base FTD key '{base_ftd_key_candidate}' from '{map_type_input}', but 'bit_depth_policy' is missing/invalid: '{policy}'. Defaulting to 'preserve'.")
+                    return "preserve"

         # If no match found after trying direct and derived keys
-        log.warning(f"Map type identifier '{map_type_input}' (or its derived base) not found in FILE_TYPE_DEFINITIONS. Defaulting bit depth rule to 'respect'.")
-        return "respect"
+        log.warning(f"Map type identifier '{map_type_input}' (or its derived base) not found in FILE_TYPE_DEFINITIONS. Defaulting bit depth policy to 'preserve'.")
+        return "preserve"

     def get_16bit_output_formats(self) -> tuple[str, str]:
         """Gets the primary and fallback format names for 16-bit output."""
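The `preserve` semantics introduced above (promote to 16-bit when any contributing input exceeds 8-bit, otherwise stay 8-bit) can be sketched as a small helper. This is an illustrative sketch, not the project's actual API; the function name and signature are assumptions.

```python
def resolve_output_bit_depth(policy: str, input_bit_depths: list[int]) -> int:
    """Resolve the output bit depth for a map or merge operation.

    policy: "force_8bit", "force_16bit", or "preserve".
    input_bit_depths: bit depths of all contributing source images.
    """
    if policy == "force_8bit":
        return 8
    if policy == "force_16bit":
        return 16
    # "preserve": promote to 16-bit if any input exceeds 8 bits.
    return 16 if any(d > 8 for d in input_bit_depths) else 8
```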
Binary file context_portal/conport_vector_data/chroma.sqlite3 (Stored with Git LFS) not shown.
Binary file context_portal/context.db (Stored with Git LFS) not shown.
@@ -973,26 +973,26 @@ class ConfigEditorDialog(QDialog):
         self.merge_rule_details_layout.addRow(group)
         self.merge_rule_widgets["defaults_table"] = defaults_table

-        # output_bit_depth: QComboBox (Options: "respect_inputs", "force_8bit", "force_16bit"). Label: "Output Bit Depth".
-        if "output_bit_depth" in rule_data:
-            label = QLabel("Output Bit Depth:")
+        # bit_depth_policy: QComboBox (Options: "preserve", "force_8bit", "force_16bit"). Label: "Bit Depth Policy".
+        if "bit_depth_policy" in rule_data:
+            label = QLabel("Bit Depth Policy:")
             widget = QComboBox()
-            options = ["respect_inputs", "force_8bit", "force_16bit"]
+            options = ["preserve", "force_8bit", "force_16bit"]
             widget.addItems(options)
-            if rule_data["output_bit_depth"] in options:
-                widget.setCurrentText(rule_data["output_bit_depth"])
+            if rule_data["bit_depth_policy"] in options:
+                widget.setCurrentText(rule_data["bit_depth_policy"])
             self.merge_rule_details_layout.addRow(label, widget)
-            self.merge_rule_widgets["output_bit_depth"] = widget
+            self.merge_rule_widgets["bit_depth_policy"] = widget

         # Add stretch to push widgets to the top
         self.merge_rule_details_layout.addStretch()

-        # Connect output_bit_depth QComboBox to update rule data
-        if "output_bit_depth" in self.merge_rule_widgets and isinstance(self.merge_rule_widgets["output_bit_depth"], QComboBox):
-            self.merge_rule_widgets["output_bit_depth"].currentTextChanged.connect(
-                lambda text, key="output_bit_depth": self.update_rule_data_simple_field(text, key)
+        # Connect bit_depth_policy QComboBox to update rule data
+        if "bit_depth_policy" in self.merge_rule_widgets and isinstance(self.merge_rule_widgets["bit_depth_policy"], QComboBox):
+            self.merge_rule_widgets["bit_depth_policy"].currentTextChanged.connect(
+                lambda text, key="bit_depth_policy": self.update_rule_data_simple_field(text, key)
             )
@@ -1107,7 +1107,7 @@ class ConfigEditorDialog(QDialog):
             "output_map_type": "NEW_RULE",
             "inputs": {"R": "", "G": "", "B": "", "A": ""},
             "defaults": {"R": 0.0, "G": 0.0, "B": 0.0, "A": 1.0},
-            "output_bit_depth": "respect_inputs"
+            "bit_depth_policy": "preserve"
         }

         # Add to the internal list that backs the UI
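Since existing presets may still carry the legacy keys, a one-shot migration helper could rename them to the unified `bit_depth_policy` and map the old `respect` / `respect_inputs` values to `preserve`. This is a hypothetical sketch; the commit itself does not include such a migration, and the helper name is an assumption.

```python
# Legacy value -> unified value mapping (assumed from the refactoring plan).
_POLICY_ALIASES = {"respect": "preserve", "respect_inputs": "preserve"}

def migrate_bit_depth_keys(entry: dict) -> dict:
    """Rename legacy bit depth keys on a definition or merge rule in place."""
    for old_key in ("bit_depth_rule", "output_bit_depth"):
        if old_key in entry and "bit_depth_policy" not in entry:
            value = entry.pop(old_key)
            entry["bit_depth_policy"] = _POLICY_ALIASES.get(value, value)
    return entry
```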
@@ -1417,8 +1417,9 @@ class ConfigEditorDialog(QDialog):
             self.widgets["RESOLUTION_THRESHOLD_FOR_JPG"].setCurrentText(current_text_selection)

-        elif key == "MAP_BIT_DEPTH_RULES" and "MAP_BIT_DEPTH_RULES_TABLE" in self.widgets:
-            self.populate_map_bit_depth_rules_table(self.widgets["MAP_BIT_DEPTH_RULES_TABLE"], value)
+        # The MAP_BIT_DEPTH_RULES table is removed as per refactoring plan.
+        # elif key == "MAP_BIT_DEPTH_RULES" and "MAP_BIT_DEPTH_RULES_TABLE" in self.widgets:
+        #     self.populate_map_bit_depth_rules_table(self.widgets["MAP_BIT_DEPTH_RULES_TABLE"], value)

         elif key == "MAP_MERGE_RULES" and hasattr(self, 'merge_rules_list'): # Check if the list widget exists
@@ -1492,10 +1493,10 @@ class ConfigEditorDialog(QDialog):
             item_standard_type = QTableWidgetItem(standard_type_str)
             table.setItem(row, 4, item_standard_type)

-            # Bit Depth Rule column (simple QTableWidgetItem for now)
-            bit_depth_rule_str = details.get("bit_depth_rule", "")
-            item_bit_depth_rule = QTableWidgetItem(bit_depth_rule_str)
-            table.setItem(row, 5, item_bit_depth_rule)
+            # Bit Depth Policy column (simple QTableWidgetItem for now)
+            bit_depth_policy_str = details.get("bit_depth_policy", "")
+            item_bit_depth_policy = QTableWidgetItem(bit_depth_policy_str)
+            table.setItem(row, 5, item_bit_depth_policy)

             # Background color is now handled by the delegate's paint method based on data
@@ -1525,14 +1526,15 @@ class ConfigEditorDialog(QDialog):
             row += 1


-    def populate_map_bit_depth_rules_table(self, table: QTableWidget, rules_data: dict):
-        """Populates the map bit depth rules table."""
-        table.setRowCount(len(rules_data))
-        row = 0
-        for map_type, rule in rules_data.items():
-            table.setItem(row, 0, QTableWidgetItem(map_type))
-            table.setItem(row, 1, QTableWidgetItem(str(rule))) # Rule (respect/force_8bit)
-            row += 1
+    # The populate_map_bit_depth_rules_table method is removed as per refactoring plan.
+    # def populate_map_bit_depth_rules_table(self, table: QTableWidget, rules_data: dict):
+    #     """Populates the map bit depth rules table."""
+    #     table.setRowCount(len(rules_data))
+    #     row = 0
+    #     for map_type, rule in rules_data.items():
+    #         table.setItem(row, 0, QTableWidgetItem(map_type))
+    #         table.setItem(row, 1, QTableWidgetItem(str(rule))) # Rule (respect/force_8bit)
+    #         row += 1

@@ -567,8 +567,8 @@ class DefinitionsEditorDialog(QDialog):

         # Bit Depth Rule
         self.ft_bit_depth_combo = QComboBox()
-        self.ft_bit_depth_combo.addItems(["respect", "force_8bit", "force_16bit"])
-        details_layout.addRow("Bit Depth Rule:", self.ft_bit_depth_combo)
+        self.ft_bit_depth_combo.addItems(["preserve", "force_8bit", "force_16bit"])
+        details_layout.addRow("Bit Depth Policy:", self.ft_bit_depth_combo)

         # Is Grayscale
         self.ft_is_grayscale_check = QCheckBox("Is Grayscale")
@@ -606,7 +606,7 @@ class DefinitionsEditorDialog(QDialog):
             logger.warning(f"File type data for '{key}' is not a dict: {ft_data_item}. Using default.")
             ft_data_item = {
                 "description": str(ft_data_item), "color": "#ffffff", "examples": [],
-                "standard_type": "", "bit_depth_rule": "respect",
+                "standard_type": "", "bit_depth_policy": "preserve",
                 "is_grayscale": False, "keybind": ""
             }
@@ -615,7 +615,7 @@ class DefinitionsEditorDialog(QDialog):
         ft_data_item.setdefault('color', '#ffffff')
         ft_data_item.setdefault('examples', [])
         ft_data_item.setdefault('standard_type', '')
-        ft_data_item.setdefault('bit_depth_rule', 'respect')
+        ft_data_item.setdefault('bit_depth_policy', 'preserve')
         ft_data_item.setdefault('is_grayscale', False)
         ft_data_item.setdefault('keybind', '')
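The `setdefault` chain above normalizes partially specified file type definitions so every consumer can rely on the `bit_depth_policy` key being present. A minimal standalone sketch of the same normalization (the function name is an assumption, the defaults mirror the diff):

```python
def normalize_file_type_definition(ft_data: dict) -> dict:
    """Fill in missing keys on a file type definition with the diff's defaults."""
    ft_data.setdefault('color', '#ffffff')
    ft_data.setdefault('examples', [])
    ft_data.setdefault('standard_type', '')
    ft_data.setdefault('bit_depth_policy', 'preserve')  # unified key
    ft_data.setdefault('is_grayscale', False)
    ft_data.setdefault('keybind', '')
    return ft_data
```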
@@ -651,7 +651,7 @@ class DefinitionsEditorDialog(QDialog):
             logger.error(f"Invalid data for file type item {current_item.text()}. Expected dict, got {type(ft_data)}")
             ft_data = {
                 "description": "Error: Invalid data", "color": "#ff0000", "examples": [],
-                "standard_type": "error", "bit_depth_rule": "respect",
+                "standard_type": "error", "bit_depth_policy": "preserve",
                 "is_grayscale": False, "keybind": "X"
             }
@@ -664,11 +664,11 @@ class DefinitionsEditorDialog(QDialog):

         self.ft_standard_type_edit.setText(ft_data.get('standard_type', ''))

-        bdr_index = self.ft_bit_depth_combo.findText(ft_data.get('bit_depth_rule', 'respect'))
+        bdr_index = self.ft_bit_depth_combo.findText(ft_data.get('bit_depth_policy', 'preserve'))
         if bdr_index != -1:
             self.ft_bit_depth_combo.setCurrentIndex(bdr_index)
         else:
-            self.ft_bit_depth_combo.setCurrentIndex(0) # Default to 'respect'
+            self.ft_bit_depth_combo.setCurrentIndex(0) # Default to 'preserve'

         self.ft_is_grayscale_check.setChecked(ft_data.get('is_grayscale', False))
         self.ft_keybind_edit.setText(ft_data.get('keybind', ''))
@@ -725,7 +725,7 @@ class DefinitionsEditorDialog(QDialog):
             "color": "#ffffff",
             "examples": [],
             "standard_type": "",
-            "bit_depth_rule": "respect",
+            "bit_depth_policy": "preserve",
             "is_grayscale": False,
             "keybind": ""
         }
@@ -869,7 +869,7 @@ class DefinitionsEditorDialog(QDialog):
         # Update based on which widget triggered (or update all)
         ft_data['description'] = self.ft_description_edit.toPlainText()
         ft_data['standard_type'] = self.ft_standard_type_edit.text()
-        ft_data['bit_depth_rule'] = self.ft_bit_depth_combo.currentText()
+        ft_data['bit_depth_policy'] = self.ft_bit_depth_combo.currentText()
         ft_data['is_grayscale'] = self.ft_is_grayscale_check.isChecked()

         # Keybind validation (force uppercase)
@@ -786,7 +786,8 @@ class MainWindow(QMainWindow):

         if RuleBasedPredictionHandler and self.prediction_thread is None:
             self.prediction_thread = QThread(self)
-            self.prediction_handler = RuleBasedPredictionHandler(input_source_identifier="", original_input_paths=[], preset_name="")
+            # Pass the Configuration object to the prediction handler
+            self.prediction_handler = RuleBasedPredictionHandler(config_obj=self.config, input_source_identifier="", original_input_paths=[], preset_name="")
             self.prediction_handler.moveToThread(self.prediction_thread)

             self.start_prediction_signal.connect(self.prediction_handler.run_prediction, Qt.ConnectionType.QueuedConnection)
@@ -6,7 +6,7 @@ import re
 import tempfile
 import zipfile
 from collections import defaultdict, Counter
-from typing import List, Dict, Any
+from typing import List, Dict, Any, Set, Tuple # Added Set, Tuple

 # --- PySide6 Imports ---
 from PySide6.QtCore import QObject, Slot # Keep QObject for parent type hint, Slot for classify_files if kept as method
@@ -303,17 +303,19 @@ class RuleBasedPredictionHandler(BasePredictionHandler):
     Inherits from BasePredictionHandler for common threading and signaling.
     """

-    def __init__(self, input_source_identifier: str, original_input_paths: list[str], preset_name: str, parent: QObject = None):
+    def __init__(self, config_obj: Configuration, input_source_identifier: str, original_input_paths: list[str], preset_name: str, parent: QObject = None):
         """
-        Initializes the rule-based handler.
+        Initializes the rule-based handler with a Configuration object.

         Args:
+            config_obj: The main configuration object.
             input_source_identifier: The unique identifier for the input source (e.g., file path).
             original_input_paths: List of absolute file paths extracted from the source.
             preset_name: The name of the preset configuration to use.
             parent: The parent QObject.
         """
         super().__init__(input_source_identifier, parent)
+        self.config = config_obj # Store the Configuration object
         self.original_input_paths = original_input_paths
         self.preset_name = preset_name
         self._current_input_path = None
@@ -362,16 +364,24 @@ class RuleBasedPredictionHandler(BasePredictionHandler):
             log.warning(f"Input source path does not exist: '{input_source_identifier}'. Skipping prediction.")
             raise FileNotFoundError(f"Input source path not found: {input_source_identifier}")

-        # --- Load Configuration ---
-        config = Configuration(preset_name)
-        log.info(f"Successfully loaded configuration for preset '{preset_name}'.")
+        # --- Use Provided Configuration ---
+        # The Configuration object is now passed during initialization.
+        # Ensure the correct preset is loaded in the passed config object if necessary,
+        # or rely on the caller (MainWindow) to ensure the config object is in the correct state.
+        # MainWindow's load_preset method re-initializes the config, so it should be correct.
+        # We just need to use the stored self.config.
+        log.info(f"Using provided configuration object for preset '{preset_name}'.")
+        # No need to create a new Configuration instance here.
+        # config = Configuration(preset_name) # REMOVED
+        # log.info(f"Successfully loaded configuration for preset '{preset_name}'.") # REMOVED

         if self._is_cancelled: raise RuntimeError("Prediction cancelled before classification.")

         # --- Perform Classification ---
         self.status_update.emit(f"Classifying files for '{source_path.name}'...")
         try:
-            classified_assets = classify_files(original_input_paths, config)
+            # Use the stored config object
+            classified_assets = classify_files(original_input_paths, self.config)
         except Exception as e:
             log.exception(f"Error during file classification for source '{input_source_identifier}': {e}")
             raise RuntimeError(f"Error classifying files: {e}") from e
@@ -388,26 +398,29 @@ class RuleBasedPredictionHandler(BasePredictionHandler):
         # --- Build the Hierarchy ---
         self.status_update.emit(f"Building rule hierarchy for '{source_path.name}'...")
         try:
-            supplier_identifier = config.supplier_name
+            # Use the stored config object
+            supplier_identifier = self.config.supplier_name
             source_rule = SourceRule(
                 input_path=input_source_identifier,
                 supplier_identifier=supplier_identifier,
-                # Use the internal display name from the config object
-                preset_name=config.internal_display_preset_name
+                # Use the internal display name from the stored config object
+                preset_name=self.config.internal_display_preset_name
             )
             asset_rules = []
-            file_type_definitions = config._core_settings.get('FILE_TYPE_DEFINITIONS', {})
+            # Access file type definitions via the public getter method from the stored config object
+            file_type_definitions = self.config.get_file_type_definitions_with_examples()

             for asset_name, files_info in classified_assets.items():
                 if self._is_cancelled: raise RuntimeError("Prediction cancelled during hierarchy building (assets).")
                 if not files_info: continue

-                asset_category_rules = config.asset_category_rules
-                asset_type_definitions = config.get_asset_type_definitions()
+                # Use the stored config object
+                asset_category_rules = self.config.asset_category_rules
+                asset_type_definitions = self.config.get_asset_type_definitions()
                 asset_type_keys = list(asset_type_definitions.keys())

-                # Initialize predicted_asset_type using the validated default
-                predicted_asset_type = config.default_asset_category
+                # Initialize predicted_asset_type using the validated default from stored config
+                predicted_asset_type = self.config.default_asset_category
                 log.debug(f"Asset '{asset_name}': Initial predicted_asset_type set to default: '{predicted_asset_type}'.")

                 # 1. Check asset_category_rules from preset
@@ -415,7 +428,8 @@ class RuleBasedPredictionHandler(BasePredictionHandler):

                 # Check for Model type based on file patterns
                 if "Model" in asset_type_keys:
-                    model_patterns_regex = config.compiled_model_regex
+                    # Use the stored config object
+                    model_patterns_regex = self.config.compiled_model_regex
                     for f_info in files_info:
                         if f_info['item_type'] in ["EXTRA", "FILE_IGNORE"]:
                             continue
@@ -447,12 +461,13 @@ class RuleBasedPredictionHandler(BasePredictionHandler):
                             pass

                 # 2. If not determined by specific rules, check for Surface (if not Model/Decal by rule)
-                if not determined_by_rule and predicted_asset_type == config.default_asset_category and "Surface" in asset_type_keys:
+                if not determined_by_rule and predicted_asset_type == self.config.default_asset_category and "Surface" in asset_type_keys:
                     item_types_in_asset = {f_info['item_type'] for f_info in files_info}
                     # Ensure we are checking against standard map types from FILE_TYPE_DEFINITIONS
                     # This check is primarily for PBR texture sets.
+                    # Use the stored config object
                     material_indicators = {
-                        ft_key for ft_key, ft_def in config.get_file_type_definitions_with_examples().items()
+                        ft_key for ft_key, ft_def in self.config.get_file_type_definitions_with_examples().items()
                         if ft_def.get('standard_type') and ft_def.get('standard_type') not in ["", "EXTRA", "FILE_IGNORE", "MODEL"]
                     }
                     # Add common direct standard types as well for robustness
@@ -466,7 +481,7 @@ class RuleBasedPredictionHandler(BasePredictionHandler):
                             has_material_map = True
                             break
                         # Check standard type if item_type is a key in FILE_TYPE_DEFINITIONS
-                        item_def = config.get_file_type_definitions_with_examples().get(item_type)
+                        item_def = self.config.get_file_type_definitions_with_examples().get(item_type)
                         if item_def and item_def.get('standard_type') in material_indicators:
                             has_material_map = True
                             break
@@ -478,8 +493,8 @@ class RuleBasedPredictionHandler(BasePredictionHandler):
                 # 3. Final validation: Ensure predicted_asset_type is a valid key.
                 if predicted_asset_type not in asset_type_keys:
                     log.warning(f"Derived AssetType '{predicted_asset_type}' for asset '{asset_name}' is not in ASSET_TYPE_DEFINITIONS. "
-                                f"Falling back to default: '{config.default_asset_category}'.")
-                    predicted_asset_type = config.default_asset_category
+                                f"Falling back to default: '{self.config.default_asset_category}'.")
+                    predicted_asset_type = self.config.default_asset_category

                 asset_rule = AssetRule(asset_name=asset_name, asset_type=predicted_asset_type)
                 file_rules = []
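The final-validation step above reduces to a simple membership check with a fallback. A minimal sketch of that pattern, with an assumed function name:

```python
def validate_asset_type(predicted: str, valid_keys: set[str], default: str) -> str:
    """Return the predicted asset type if it is a known key, else the default."""
    return predicted if predicted in valid_keys else default
```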
@@ -494,7 +509,8 @@ class RuleBasedPredictionHandler(BasePredictionHandler):
                     # No need for the old MAP_ prefixing logic here.

                     # Validate the final_item_type against definitions, unless it's EXTRA or FILE_IGNORE
-                    if final_item_type not in ["EXTRA", "FILE_IGNORE"] and file_type_definitions and final_item_type not in file_type_definitions:
+                    # Use the stored config object
+                    if final_item_type not in ["EXTRA", "FILE_IGNORE"] and self.config.get_file_type_definitions_with_examples() and final_item_type not in self.config.get_file_type_definitions_with_examples():
                         log.warning(f"Predicted ItemType '{final_item_type}' for file '{file_info['file_path']}' is not in FILE_TYPE_DEFINITIONS. Setting to FILE_IGNORE.")
                         final_item_type = "FILE_IGNORE"
@@ -64,7 +64,7 @@ class InitialScalingOutput:
 @dataclass
 class SaveVariantsInput:
     image_data: np.ndarray # Final data (potentially scaled)
-    internal_map_type: str # Final internal type (e.g., MAP_ROUGH, MAP_COL-1)
+    final_internal_map_type: str # Final internal type (e.g., MAP_ROUGH, MAP_COL-1)
     source_bit_depth_info: List[int]
     # Configuration needed
     output_filename_pattern_tokens: Dict[str, Any]
@@ -284,11 +284,11 @@ class PipelineOrchestrator:

         save_input = SaveVariantsInput(
             image_data=current_image_data,
-            internal_map_type=item.map_type_identifier,
+            final_internal_map_type=item.map_type_identifier,
             source_bit_depth_info=[item.bit_depth] if item.bit_depth is not None else [8], # Default to 8 if not set
             output_filename_pattern_tokens=output_filename_tokens,
             image_resolutions=save_specific_resolutions, # Pass the specific resolution(s)
-            file_type_defs=getattr(context.config_obj, "FILE_TYPE_DEFINITIONS", {}),
+            file_type_defs=context.config_obj.get_file_type_definitions_with_examples(),
             output_format_8bit=context.config_obj.get_8bit_output_format(),
             output_format_16bit_primary=context.config_obj.get_16bit_output_formats()[0],
             output_format_16bit_fallback=context.config_obj.get_16bit_output_formats()[1],
@@ -378,7 +378,7 @@ class PipelineOrchestrator:
         # The `image_saving_utils.save_image_variants` will iterate through `context.config_obj.image_resolutions`.
         save_input = SaveVariantsInput(
             image_data=current_image_data,
-            internal_map_type=processed_data.output_map_type,
+            final_internal_map_type=processed_data.output_map_type,
             source_bit_depth_info=processed_data.source_bit_depths,
             output_filename_pattern_tokens=output_filename_tokens,
             image_resolutions=context.config_obj.image_resolutions, # Pass all configured resolutions
@@ -97,7 +97,7 @@ class OutputOrganizationStage(ProcessingStage):
             token_data_variant = {
                 "assetname": asset_name_for_log,
                 "supplier": context.effective_supplier or "DefaultSupplier",
-                "asset_category": context.asset_rule.asset_category, # Added asset_category
+                "asset_category": context.asset_rule.asset_type, # Used asset_type for asset_category token
                 "maptype": base_map_type,
                 "resolution": variant_resolution_key,
                 "ext": variant_ext,
@@ -165,7 +165,7 @@ class OutputOrganizationStage(ProcessingStage):
             token_data = {
                 "assetname": asset_name_for_log,
                 "supplier": context.effective_supplier or "DefaultSupplier",
-                "asset_category": context.asset_rule.asset_category, # Added asset_category
+                "asset_category": context.asset_rule.asset_type, # Used asset_type for asset_category token
                 "maptype": base_map_type,
                 "resolution": resolution_str,
                 "ext": temp_file_path.suffix.lstrip('.'),
@@ -246,7 +246,7 @@ class OutputOrganizationStage(ProcessingStage):
             base_token_data = {
                 "assetname": asset_name_for_log,
                 "supplier": context.effective_supplier or "DefaultSupplier",
-                "asset_category": context.asset_rule.asset_category, # Added asset_category
+                "asset_category": context.asset_rule.asset_type, # Used asset_type for asset_category token
                 # Add other tokens if your output_directory_pattern uses them at the asset level
                 "incrementingvalue": getattr(context, 'incrementing_value', None),
                 "sha5": getattr(context, 'sha5_value', None)
@@ -178,12 +178,20 @@ class RegularMapProcessorStage(ProcessingStage):
             log.debug(f"{log_prefix}: Loaded image {result.original_dimensions[0]}x{result.original_dimensions[1]}.")
 
             # Get original bit depth
-            try:
-                result.original_bit_depth = ipu.get_image_bit_depth(str(source_file_path_found))
-                log.info(f"{log_prefix}: Determined source bit depth: {result.original_bit_depth}")
-            except Exception as e:
-                log.warning(f"{log_prefix}: Could not determine source bit depth for {source_file_path_found}: {e}. Setting to None.")
-                result.original_bit_depth = None # Indicate failure to determine
+            # Determine original bit depth from the loaded image data's dtype
+            dtype_to_bit_depth = {
+                np.dtype('uint8'): 8,
+                np.dtype('uint16'): 16,
+                np.dtype('float32'): 32,
+                np.dtype('int8'): 8,
+                np.dtype('int16'): 16,
+            }
+            result.original_bit_depth = dtype_to_bit_depth.get(source_image_data.dtype)
+
+            if result.original_bit_depth is None:
+                log.warning(f"{log_prefix}: Unknown dtype {source_image_data.dtype} for loaded image data, cannot determine bit depth. Setting to None.")
+            else:
+                log.info(f"{log_prefix}: Determined source bit depth from loaded data dtype: {result.original_bit_depth}")
 
             # --- Apply Transformations ---
             transformed_image_data, final_map_type, transform_notes = ipu.apply_common_map_transformations(
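The dtype lookup added in the hunk above can be exercised in isolation. The following is a minimal sketch (not the project's actual module); `dtype_to_bit_depth` mirrors the mapping from the diff, while `infer_bit_depth` is a hypothetical wrapper added here for illustration:

```python
import numpy as np

# Mirrors the mapping introduced above: bit depth is inferred from the
# ndarray dtype of the already-loaded image data, not from the file on disk.
dtype_to_bit_depth = {
    np.dtype('uint8'): 8,
    np.dtype('uint16'): 16,
    np.dtype('float32'): 32,
    np.dtype('int8'): 8,
    np.dtype('int16'): 16,
}

def infer_bit_depth(image: np.ndarray):
    """Return the bit depth for loaded image data, or None for unmapped dtypes."""
    return dtype_to_bit_depth.get(image.dtype)

print(infer_bit_depth(np.zeros((2, 2), dtype=np.uint16)))   # 16
print(infer_bit_depth(np.zeros((2, 2), dtype=np.float64)))  # None (unmapped)
```

Note that `dict.get` returns `None` for any dtype not in the table (e.g. `float64`), which is exactly the "unknown dtype" branch the diff logs a warning for.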
@@ -197,6 +205,11 @@ class RegularMapProcessorStage(ProcessingStage):
             result.final_internal_map_type = final_map_type # Update if Gloss->Rough changed it
             result.transformations_applied = transform_notes
 
+            # Log dtype and shape after transformations
+            log.info(f"{log_prefix}: Image data dtype after transformations: {transformed_image_data.dtype}, shape: {transformed_image_data.shape}")
+            bit_depth_after_transform = dtype_to_bit_depth.get(transformed_image_data.dtype)
+            log.info(f"{log_prefix}: Determined bit depth after transformations: {bit_depth_after_transform}")
+
             # --- Determine Resolution Key for LOWRES ---
             if config.enable_low_resolution_fallback and result.original_dimensions:
                 w, h = result.original_dimensions
@@ -208,6 +221,7 @@ class RegularMapProcessorStage(ProcessingStage):
             result.status = "Processed"
             result.error_message = None
             log.info(f"{log_prefix}: Successfully processed regular map. Final type: '{result.final_internal_map_type}', ResolutionKey: {result.resolution_key}.")
+            log.debug(f"{log_prefix}: Processed image data dtype before returning: {result.processed_image_data.dtype}, shape: {result.processed_image_data.shape}")
 
         except Exception as e:
             log.exception(f"{log_prefix}: Unhandled exception during processing: {e}")
@@ -22,7 +22,7 @@ class SaveVariantsStage(ProcessingStage):
         """
        Calls isu.save_image_variants with data from input_data.
         """
-        internal_map_type = input_data.internal_map_type
+        internal_map_type = input_data.final_internal_map_type
         # The input_data for SaveVariantsStage doesn't directly contain the ProcessingItem.
         # It receives data *derived* from a ProcessingItem by previous stages.
         # For debugging, we'd need to pass more context or rely on what's in output_filename_pattern_tokens.
@@ -59,7 +59,7 @@ class SaveVariantsStage(ProcessingStage):
 
         save_args = {
             "source_image_data": input_data.image_data,
-            "base_map_type": base_map_type_friendly, # Use the friendly type
+            "final_internal_map_type": input_data.final_internal_map_type, # Pass the internal type identifier
             "source_bit_depth_info": input_data.source_bit_depth_info,
             "image_resolutions": input_data.image_resolutions,
             "file_type_defs": input_data.file_type_defs,
@@ -304,9 +304,11 @@ def load_image(image_path: Union[str, Path], read_flag: int = cv2.IMREAD_UNCHANG
     try:
         img = cv2.imread(str(image_path), read_flag)
         if img is None:
-            # print(f"Warning: Failed to load image: {image_path}") # Optional: for debugging utils
+            ipu_log.warning(f"Failed to load image: {image_path}")
             return None
 
+        ipu_log.debug(f"Loaded image '{image_path}'. Initial dtype: {img.dtype}, shape: {img.shape}")
+
         # Ensure RGB/RGBA for color images
         if len(img.shape) == 3:
             if img.shape[2] == 4: # BGRA from OpenCV
@@ -392,8 +394,11 @@ def save_image(
     path_obj = Path(image_path)
     path_obj.parent.mkdir(parents=True, exist_ok=True)
 
+    ipu_log.debug(f"Saving image '{path_obj}'. Initial data dtype: {img_to_save.dtype}, shape: {img_to_save.shape}")
+
     # 1. Data Type Conversion
     if output_dtype_target is not None:
+        ipu_log.debug(f"Attempting to convert image data to target dtype: {output_dtype_target}")
         if output_dtype_target == np.uint8 and img_to_save.dtype != np.uint8:
             if img_to_save.dtype == np.uint16: img_to_save = (img_to_save.astype(np.float32) / 65535.0 * 255.0).astype(np.uint8)
             elif img_to_save.dtype in [np.float16, np.float32, np.float64]: img_to_save = (np.clip(img_to_save, 0.0, 1.0) * 255.0).astype(np.uint8)
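The uint8 conversion branch shown in this hunk rescales rather than truncates. As a standalone sketch (the `to_uint8` helper is hypothetical; the scaling expressions are taken verbatim from the diff):

```python
import numpy as np

def to_uint8(img: np.ndarray) -> np.ndarray:
    """Downconvert to 8-bit the way the branch above does:
    uint16 is rescaled through float32, floats are clipped to [0, 1] then scaled."""
    if img.dtype == np.uint16:
        return (img.astype(np.float32) / 65535.0 * 255.0).astype(np.uint8)
    if img.dtype in (np.float16, np.float32, np.float64):
        return (np.clip(img, 0.0, 1.0) * 255.0).astype(np.uint8)
    return img  # already uint8 (or an unhandled dtype): pass through

print(to_uint8(np.array([0, 65535], dtype=np.uint16)))  # [  0 255]
```

Dividing by 65535 before multiplying by 255 maps the full 16-bit range onto the full 8-bit range; a plain `astype(np.uint8)` would instead wrap values modulo 256.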
@@ -413,6 +418,8 @@ def save_image(
         elif img_to_save.dtype == np.float16: img_to_save = img_to_save.astype(np.float32)
 
+    ipu_log.debug(f"Saving image '{path_obj}'. Data dtype after conversion attempt: {img_to_save.dtype}, shape: {img_to_save.shape}")
+
     # 2. Color Space Conversion (Internal RGB/RGBA -> BGR/BGRA for OpenCV)
     # Input `image_data` is assumed to be in RGB/RGBA format (due to `load_image` changes).
     # OpenCV's `imwrite` typically expects BGR/BGRA for formats like PNG, JPG.
@@ -460,6 +467,8 @@ def apply_common_map_transformations(
     current_image_data = image_data # Start with original data
     updated_processing_map_type = processing_map_type # Start with original type
 
+    ipu_log.debug(f"{log_prefix}: apply_common_map_transformations - Initial image data dtype: {current_image_data.dtype}, shape: {current_image_data.shape}")
+
     # Gloss-to-Rough
     # Check if the base type is Gloss (before suffix)
     base_map_type_match = re.match(r"(MAP_GLOSS)", processing_map_type)
@@ -494,6 +503,8 @@ def apply_common_map_transformations(
         current_image_data = invert_normal_map_green_channel(current_image_data)
         transformation_notes.append("Normal Green Inverted (Global)")
 
+    ipu_log.debug(f"{log_prefix}: apply_common_map_transformations - Final image data dtype: {current_image_data.dtype}, shape: {current_image_data.shape}")
+
     return current_image_data, updated_processing_map_type, transformation_notes
 
 # --- Normal Map Utilities ---
@@ -4,6 +4,9 @@ import numpy as np
 from pathlib import Path
 from typing import List, Dict, Any, Tuple, Optional
 
+# Import necessary utility functions
+from utils.path_utils import get_filename_friendly_map_type # Import the function
+
 # Potentially import ipu from ...utils import image_processing_utils as ipu
 # Assuming ipu is available in the same utils directory or parent
 try:
@@ -22,7 +25,7 @@ logger = logging.getLogger(__name__)
 
 def save_image_variants(
     source_image_data: np.ndarray,
-    base_map_type: str, # Filename-friendly map type
+    final_internal_map_type: str, # Use the internal map type identifier
     source_bit_depth_info: List[Optional[int]],
     image_resolutions: Dict[str, int],
    file_type_defs: Dict[str, Dict[str, Any]],
@@ -42,14 +45,13 @@ def save_image_variants(
 
     Args:
         source_image_data (np.ndarray): High-res image data (in memory, potentially transformed).
-        base_map_type (str): Final map type (e.g., "COL", "ROUGH", "NORMAL", "MAP_NRMRGH").
-                             This is the filename-friendly map type.
+        final_internal_map_type (str): Final internal map type (e.g., "MAP_COL", "MAP_NRM", "MAP_NRMRGH").
         source_bit_depth_info (List[Optional[int]]): List of original source bit depth(s)
                                                      (e.g., [8], [16], [8, 16]). Can contain None.
         image_resolutions (Dict[str, int]): Dictionary mapping resolution keys (e.g., "4K")
                                             to max dimensions (e.g., 4096).
         file_type_defs (Dict[str, Dict[str, Any]]): Dictionary defining properties for map types,
-                                                    including 'bit_depth_rule'.
+                                                    including 'bit_depth_policy'.
         output_format_8bit (str): File extension for 8-bit output (e.g., "jpg", "png").
         output_format_16bit_primary (str): Primary file extension for 16-bit output (e.g., "png", "tif").
         output_format_16bit_fallback (str): Fallback file extension for 16-bit output.
@@ -76,30 +78,46 @@ def save_image_variants(
     source_max_dim = max(source_h, source_w)
 
     # 1. Use provided configuration inputs (already available as function arguments)
-    logger.info(f"SaveImageVariants: Starting for map type: {base_map_type}. Source shape: {source_image_data.shape}, Source bit depths: {source_bit_depth_info}")
+    logger.info(f"SaveImageVariants: Starting for map type: {final_internal_map_type}. Source shape: {source_image_data.shape}, Source bit depths: {source_bit_depth_info}")
     logger.debug(f"SaveImageVariants: Resolutions: {image_resolutions}, File Type Defs: {file_type_defs.keys()}, Output Formats: 8bit={output_format_8bit}, 16bit_pri={output_format_16bit_primary}, 16bit_fall={output_format_16bit_fallback}")
     logger.debug(f"SaveImageVariants: PNG Comp: {png_compression_level}, JPG Qual: {jpg_quality}")
     logger.debug(f"SaveImageVariants: Output Tokens: {output_filename_pattern_tokens}, Output Pattern: {output_filename_pattern}")
     logger.debug(f"SaveImageVariants: Received resolution_threshold_for_jpg: {resolution_threshold_for_jpg}") # Log received threshold
 
-    # 2. Determine Target Bit Depth
-    target_bit_depth = 8 # Default
-    bit_depth_rule = file_type_defs.get(base_map_type, {}).get('bit_depth_rule', 'force_8bit')
-    if bit_depth_rule not in ['force_8bit', 'respect_inputs']:
-        logger.warning(f"Unknown bit_depth_rule '{bit_depth_rule}' for map type '{base_map_type}'. Defaulting to 'force_8bit'.")
-        bit_depth_rule = 'force_8bit'
-
-    if bit_depth_rule == 'respect_inputs':
+    # 2. Determine Target Bit Depth based on bit_depth_policy
+    # Use the final_internal_map_type for lookup in file_type_defs
+    bit_depth_policy = file_type_defs.get(final_internal_map_type, {}).get('bit_depth_policy', '')
+
+    logger.info(f"SaveImageVariants: Determining target bit depth for map type: {final_internal_map_type} with policy: '{bit_depth_policy}'. Source bit depths: {source_bit_depth_info}")
+
+    if bit_depth_policy == "force_8bit":
+        target_bit_depth = 8
+        logger.debug(f"SaveImageVariants: Policy 'force_8bit' applied. Target bit depth: {target_bit_depth}")
+    elif bit_depth_policy == "force_16bit":
+        target_bit_depth = 16
+        logger.debug(f"SaveImageVariants: Policy 'force_16bit' applied. Target bit depth: {target_bit_depth}")
+    elif bit_depth_policy == "preserve":
         # Check if any source bit depth is > 8, ignoring None
         if any(depth is not None and depth > 8 for depth in source_bit_depth_info):
             target_bit_depth = 16
+            logger.debug(f"SaveImageVariants: Policy 'preserve' applied, source > 8 found. Setting target_bit_depth = {target_bit_depth}")
         else:
             target_bit_depth = 8
-        logger.info(f"Bit depth rule 'respect_inputs' applied. Source bit depths: {source_bit_depth_info}. Target bit depth: {target_bit_depth}")
-    else: # force_8bit
-        target_bit_depth = 8
-        logger.info(f"Bit depth rule 'force_8bit' applied. Target bit depth: {target_bit_depth}")
+            logger.debug(f"SaveImageVariants: Policy 'preserve' applied, no source > 8 found. Setting target_bit_depth = {target_bit_depth}")
+    elif bit_depth_policy == "" or bit_depth_policy not in ["force_8bit", "force_16bit", "preserve"]:
+        # Handle "" policy or any other unexpected/unknown value
+        # For unknown/empty policies, apply the 'preserve' logic based on source bit depths.
+        if bit_depth_policy == "":
+            logger.warning(f"Empty bit_depth_policy for map type '{final_internal_map_type}'. Applying 'preserve' logic.")
+        else:
+            logger.warning(f"Unknown bit_depth_policy '{bit_depth_policy}' for map type '{final_internal_map_type}'. Applying 'preserve' logic.")
+
+        if any(depth is not None and depth > 8 for depth in source_bit_depth_info):
+            target_bit_depth = 16
+            logger.debug(f"SaveImageVariants: Applying 'preserve' logic, source > 8 found. Setting target_bit_depth = {target_bit_depth}")
+        else:
+            target_bit_depth = 8
+            logger.debug(f"SaveImageVariants: Applying 'preserve' logic, no source > 8 found. Setting target_bit_depth = {target_bit_depth}")
 
     # 3. Determine Output File Format(s)
     if target_bit_depth == 8:
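The policy branch above reduces to a small pure function. This is a sketch for illustration, not the project's code; `resolve_target_bit_depth` is a hypothetical name, and the logging is omitted:

```python
from typing import List, Optional

def resolve_target_bit_depth(policy: str, source_depths: List[Optional[int]]) -> int:
    """Mirror of the diff's policy logic: 'force_8bit'/'force_16bit' are absolute;
    'preserve' (and, per the diff, empty or unknown policies) picks 16-bit only
    when some known source depth exceeds 8, ignoring None entries."""
    if policy == "force_8bit":
        return 8
    if policy == "force_16bit":
        return 16
    # 'preserve', '' and any unrecognized value all apply the preserve logic
    return 16 if any(d is not None and d > 8 for d in source_depths) else 8

print(resolve_target_bit_depth("preserve", [8, 16]))  # 16
print(resolve_target_bit_depth("", [None, 8]))        # 8
```

Treating empty/unknown policies like `"preserve"` makes the merged-map path safe by default: a 16-bit input can no longer be silently flattened to 8-bit by a missing config entry, which was the bug the old `force_8bit` fallback caused.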
@@ -118,26 +136,27 @@ def save_image_variants(
 
     current_output_ext = output_ext # Store the initial extension based on bit depth
 
-    logger.info(f"SaveImageVariants: Determined target bit depth: {target_bit_depth}, Initial output format: {current_output_ext} for map type {base_map_type}")
+    # Move this logging statement AFTER current_output_ext is assigned
+    logger.info(f"SaveImageVariants: Final determined target bit depth: {target_bit_depth}, Initial output format: {current_output_ext} for map type {final_internal_map_type}")
 
     # 4. Generate and Save Resolution Variants
     # Sort resolutions by max dimension descending
     sorted_resolutions = sorted(image_resolutions.items(), key=lambda item: item[1], reverse=True)
 
     for res_key, res_max_dim in sorted_resolutions:
-        logger.info(f"SaveImageVariants: Processing variant {res_key} ({res_max_dim}px) for {base_map_type}")
+        logger.info(f"SaveImageVariants: Processing variant {res_key} ({res_max_dim}px) for {final_internal_map_type}")
 
         # --- Prevent Upscaling ---
         # Skip this resolution variant if its target dimension is larger than the source image's largest dimension.
         if res_max_dim > source_max_dim:
-            logger.info(f"SaveImageVariants: Skipping variant {res_key} ({res_max_dim}px) for {base_map_type} because target resolution is larger than source ({source_max_dim}px).")
+            logger.info(f"SaveImageVariants: Skipping variant {res_key} ({res_max_dim}px) for {final_internal_map_type} because target resolution is larger than source ({source_max_dim}px).")
             continue # Skip to the next resolution
 
         # Calculate target dimensions for valid variants (equal or smaller than source)
         if source_max_dim == res_max_dim:
             # Use source dimensions if target is equal
             target_w_res, target_h_res = source_w, source_h
-            logger.info(f"SaveImageVariants: Using source resolution ({source_w}x{source_h}) for {res_key} variant of {base_map_type} as target matches source.")
+            logger.info(f"SaveImageVariants: Using source resolution ({source_w}x{source_h}) for {res_key} variant of {final_internal_map_type} as target matches source.")
         else: # Downscale (source_max_dim > res_max_dim)
             # Downscale, maintaining aspect ratio
             aspect_ratio = source_w / source_h
@@ -147,14 +166,14 @@ def save_image_variants(
             else:
                 target_h_res = res_max_dim
                 target_w_res = max(1, int(res_max_dim * aspect_ratio)) # Ensure width is at least 1
-            logger.info(f"SaveImageVariants: Calculated downscale for {base_map_type} {res_key}: from ({source_w}x{source_h}) to ({target_w_res}x{target_h_res})")
+            logger.info(f"SaveImageVariants: Calculated downscale for {final_internal_map_type} {res_key}: from ({source_w}x{source_h}) to ({target_w_res}x{target_h_res})")
 
 
         # Resize source_image_data (only if necessary)
         if (target_w_res, target_h_res) == (source_w, source_h):
             # No resize needed if dimensions match
             variant_data = source_image_data.copy() # Copy to avoid modifying original if needed later
-            logger.debug(f"SaveImageVariants: No resize needed for {base_map_type} {res_key}, using copy of source data.")
+            logger.debug(f"SaveImageVariants: No resize needed for {final_internal_map_type} {res_key}, using copy of source data.")
         else:
             # Perform resize only if dimensions differ (i.e., downscaling)
             interpolation_method = cv2.INTER_AREA # Good for downscaling
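The aspect-ratio-preserving downscale can be sketched standalone. Only the portrait (`else:`) branch appears in the hunk, so the landscape branch below is an assumption inferred by symmetry, and `downscale_dims` is a hypothetical helper name:

```python
def downscale_dims(source_w: int, source_h: int, res_max_dim: int):
    """Target dimensions for a resolution variant, preserving aspect ratio.
    Assumes the caller has already skipped variants larger than the source."""
    if max(source_w, source_h) == res_max_dim:
        return source_w, source_h  # target matches source: no resize
    aspect_ratio = source_w / source_h
    if source_w >= source_h:
        # landscape/square branch (assumed; not shown in the hunk)
        return res_max_dim, max(1, int(res_max_dim / aspect_ratio))
    # portrait branch, as in the hunk above: clamp height, derive width
    return max(1, int(res_max_dim * aspect_ratio)), res_max_dim

print(downscale_dims(4096, 2048, 1024))  # (1024, 512)
print(downscale_dims(2048, 4096, 1024))  # (512, 1024)
```

The `max(1, ...)` guard matches the diff's comment "Ensure width is at least 1", protecting extreme aspect ratios from producing a zero-pixel dimension.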
@@ -162,21 +181,22 @@ def save_image_variants(
                 variant_data = ipu.resize_image(source_image_data, target_w_res, target_h_res, interpolation=interpolation_method)
                 if variant_data is None: # Check if resize failed
                     raise ValueError("ipu.resize_image returned None")
-                logger.debug(f"SaveImageVariants: Resized variant data shape for {base_map_type} {res_key}: {variant_data.shape}")
+                logger.debug(f"SaveImageVariants: Resized variant data shape for {final_internal_map_type} {res_key}: {variant_data.shape}")
             except Exception as e:
-                logger.error(f"SaveImageVariants: Error resizing image for {base_map_type} {res_key} variant: {e}")
+                logger.error(f"SaveImageVariants: Error resizing image for {final_internal_map_type} {res_key} variant: {e}")
                 continue # Skip this variant if resizing fails
 
         # Filename Construction
         current_tokens = output_filename_pattern_tokens.copy()
-        current_tokens['maptype'] = base_map_type
+        # Use the filename-friendly version for the filename token
+        current_tokens['maptype'] = get_filename_friendly_map_type(final_internal_map_type, file_type_defs)
         current_tokens['resolution'] = res_key
 
         # Determine final extension for this variant, considering JPG threshold
         final_variant_ext = current_output_ext
 
         # --- Start JPG Threshold Logging ---
-        logger.debug(f"SaveImageVariants: JPG Threshold Check for {base_map_type} {res_key}:")
+        logger.debug(f"SaveImageVariants: JPG Threshold Check for {final_internal_map_type} {res_key}:")
         logger.debug(f" - target_bit_depth: {target_bit_depth}")
         logger.debug(f" - resolution_threshold_for_jpg: {resolution_threshold_for_jpg}")
         logger.debug(f" - target_w_res: {target_w_res}, target_h_res: {target_h_res}")
@@ -198,7 +218,7 @@ def save_image_variants(
 
         if cond_bit_depth and cond_threshold_not_none and cond_res_exceeded and cond_is_png:
             final_variant_ext = 'jpg'
-            logger.info(f"SaveImageVariants: Overriding 8-bit PNG to JPG for {base_map_type} {res_key} due to resolution {max(target_w_res, target_h_res)}px > threshold {resolution_threshold_for_jpg}px.")
+            logger.info(f"SaveImageVariants: Overriding 8-bit PNG to JPG for {final_internal_map_type} {res_key} due to resolution {max(target_w_res, target_h_res)}px > threshold {resolution_threshold_for_jpg}px.")
 
         current_tokens['ext'] = final_variant_ext
 
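The four-way condition guarding the PNG-to-JPG override can be isolated as a sketch. The `cond_*` names follow the diff; `pick_extension` itself is a hypothetical wrapper added here for illustration (their assignments are not visible in this hunk, so the definitions below are inferred from how they are used):

```python
def pick_extension(target_bit_depth: int, ext: str, w: int, h: int, jpg_threshold) -> str:
    """An 8-bit PNG variant whose largest dimension exceeds the configured
    threshold is written as JPG instead; everything else keeps its extension."""
    cond_bit_depth = target_bit_depth == 8            # JPG cannot hold 16-bit data
    cond_threshold_not_none = jpg_threshold is not None
    cond_res_exceeded = cond_threshold_not_none and max(w, h) > jpg_threshold
    cond_is_png = ext == 'png'
    if cond_bit_depth and cond_threshold_not_none and cond_res_exceeded and cond_is_png:
        return 'jpg'
    return ext

print(pick_extension(8, 'png', 4096, 4096, 2048))   # jpg
print(pick_extension(16, 'png', 4096, 4096, 2048))  # png
```

Restricting the override to 8-bit variants is what keeps this consistent with the bit-depth policy: 16-bit data stays in PNG/TIFF, since JPEG has no 16-bit mode.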
@@ -216,14 +236,14 @@ def save_image_variants(
                 continue # Skip this variant
 
             output_path = output_base_directory / filename
-            logger.info(f"SaveImageVariants: Constructed output path for {base_map_type} {res_key}: {output_path}")
+            logger.info(f"SaveImageVariants: Constructed output path for {final_internal_map_type} {res_key}: {output_path}")
 
             # Ensure parent directory exists
             output_path.parent.mkdir(parents=True, exist_ok=True)
-            logger.debug(f"SaveImageVariants: Ensured directory exists for {base_map_type} {res_key}: {output_path.parent}")
+            logger.debug(f"SaveImageVariants: Ensured directory exists for {final_internal_map_type} {res_key}: {output_path.parent}")
 
         except Exception as e:
-            logger.error(f"SaveImageVariants: Error constructing filepath for {base_map_type} {res_key} variant: {e}")
+            logger.error(f"SaveImageVariants: Error constructing filepath for {final_internal_map_type} {res_key} variant: {e}")
             continue # Skip this variant if path construction fails
 
 
@@ -232,11 +252,11 @@ def save_image_variants(
         if final_variant_ext == 'jpg': # Check against final_variant_ext
             save_params_cv2.append(cv2.IMWRITE_JPEG_QUALITY)
             save_params_cv2.append(jpg_quality)
-            logger.debug(f"SaveImageVariants: Using JPG quality: {jpg_quality} for {base_map_type} {res_key}")
+            logger.debug(f"SaveImageVariants: Using JPG quality: {jpg_quality} for {final_internal_map_type} {res_key}")
         elif final_variant_ext == 'png': # Check against final_variant_ext
             save_params_cv2.append(cv2.IMWRITE_PNG_COMPRESSION)
             save_params_cv2.append(png_compression_level)
-            logger.debug(f"SaveImageVariants: Using PNG compression level: {png_compression_level} for {base_map_type} {res_key}")
+            logger.debug(f"SaveImageVariants: Using PNG compression level: {png_compression_level} for {final_internal_map_type} {res_key}")
         # Add other format specific parameters if needed (e.g., TIFF compression)
 
 
@@ -257,7 +277,8 @@ def save_image_variants(
         # Saving
         try:
             # ipu.save_image is expected to handle the actual cv2.imwrite call
-            logger.debug(f"SaveImageVariants: Attempting to save {base_map_type} {res_key} to {output_path} with params {save_params_cv2}, target_dtype: {output_dtype_for_save}")
+            logger.debug(f"SaveImageVariants: Preparing to save {final_internal_map_type} {res_key}. Data dtype: {image_data_for_save.dtype}, shape: {image_data_for_save.shape}. Target dtype for ipu.save_image: {output_dtype_for_save}")
+            logger.debug(f"SaveImageVariants: Attempting to save {final_internal_map_type} {res_key} to {output_path} with params {save_params_cv2}, target_dtype: {output_dtype_for_save}")
             success = ipu.save_image(
                 str(output_path),
                 image_data_for_save,
@@ -265,7 +286,7 @@ def save_image_variants(
                 params=save_params_cv2
             )
             if success:
-                logger.info(f"SaveImageVariants: Successfully saved {base_map_type} {res_key} variant to {output_path}")
+                logger.info(f"SaveImageVariants: Successfully saved {final_internal_map_type} {res_key} variant to {output_path}")
                 # Collect details for the returned list
                 saved_file_details.append({
                     'path': str(output_path),
@@ -275,10 +296,10 @@ def save_image_variants(
                     'dimensions': (target_w_res, target_h_res)
                 })
             else:
-                logger.error(f"SaveImageVariants: Failed to save {base_map_type} {res_key} variant to {output_path} (ipu.save_image returned False)")
+                logger.error(f"SaveImageVariants: Failed to save {final_internal_map_type} {res_key} variant to {output_path} (ipu.save_image returned False)")
 
         except Exception as e:
-            logger.error(f"SaveImageVariants: Error during ipu.save_image for {base_map_type} {res_key} variant to {output_path}: {e}", exc_info=True)
+            logger.error(f"SaveImageVariants: Error during ipu.save_image for {final_internal_map_type} {res_key} variant to {output_path}: {e}", exc_info=True)
             # Continue to next variant even if one fails
 
 
@@ -288,7 +309,7 @@ def save_image_variants(
 
 
    # 5. Return List of Saved File Details
-    logger.info(f"Finished saving variants for map type: {base_map_type}. Saved {len(saved_file_details)} variants.")
+    logger.info(f"Finished saving variants for map type: {final_internal_map_type}. Saved {len(saved_file_details)} variants.")
     return saved_file_details
 
 # Optional Helper Functions (can be added here if needed)