Compare commits
No commits in common. "Dev" and "GUI-and-Configs" have entirely different histories.

.gitattributes (vendored, 3 changes)
@@ -1,3 +0,0 @@
-*.bin filter=lfs diff=lfs merge=lfs -text
-*.db filter=lfs diff=lfs merge=lfs -text
-*.sqlite3 filter=lfs diff=lfs merge=lfs -text
@@ -39,8 +39,7 @@
 "get_conport_schema",
 "get_recent_activity_summary",
 "semantic_search_conport",
-"search_system_patterns_fts",
-"update_decision"
+"search_system_patterns_fts"
 ]
 }
 }
@@ -1,127 +0,0 @@
# Code Review & Refactoring Plan: GUI File Table

**Objective:** To identify the root causes of file list discrepancies and persistent empty asset rows in the GUI file table, and to propose refactoring solutions for improved robustness and maintainability.

**Phase 1: In-Depth Code Review**

This phase will focus on understanding the current implementation and data flow within the relevant GUI modules.

1. **Identify Key Modules & Classes:**
    * **`gui/unified_view_model.py` ([`UnifiedViewModel`](gui/unified_view_model.py:1)):** This is the primary focus. We need to analyze:
        * How it loads and represents the hierarchical data (`SourceRule` -> `AssetRule` -> `FileRule`).
        * Methods responsible for updating the model with new data (e.g., from predictions or user edits).
        * Logic for adding, removing, and modifying rows, especially `AssetRule` and `FileRule` items.
        * How it handles data consistency when underlying data changes (e.g., after LLM processing or renaming operations).
        * Signal/slot connections related to data changes.
    * **`gui/asset_restructure_handler.py` ([`AssetRestructureHandler`](gui/asset_restructure_handler.py:1)):**
        * How it listens to changes in `AssetRule` names or `FileRule` target asset overrides.
        * The logic for moving `FileRule` items between `AssetRule` items.
        * The conditions under which it creates new `AssetRule` items or removes empty ones. This is critical for the "persistent empty asset rows" issue.
    * **`gui/llm_prediction_handler.py` ([`LLMPredictionHandler`](gui/llm_prediction_handler.py:1)):**
        * How it parses the LLM response.
        * How it translates the LLM's (potentially hallucinated) file list into the `SourceRule` structure.
        * How this new `SourceRule` data is passed to and integrated by the `UnifiedViewModel`. This is key for the "file list discrepancies" issue.
    * **`gui/prediction_handler.py` ([`RuleBasedPredictionHandler`](gui/prediction_handler.py:1)):**
        * Similar to the LLM handler, how it generates `SourceRule` data from presets.
        * How its output is integrated into the `UnifiedViewModel`, especially when "reinterpreting with a systematic approach" restores correct files.
    * **`gui/main_window.py` ([`MainWindow`](gui/main_window.py:1)) & `gui/main_panel_widget.py` ([`MainPanelWidget`](gui/main_panel_widget.py:1)):**
        * How these components instantiate and connect the `UnifiedViewModel`, `AssetRestructureHandler`, and prediction handlers.
        * Event handling related to loading data, triggering predictions, and user interactions that modify the table data.
    * **`rule_structure.py`:**
        * Review the definitions of `SourceRule`, `AssetRule`, and `FileRule` to ensure a clear understanding of the data being managed.

2. **Trace Data Flow & State Management:**
    * **Initial Load:** How is the initial list of files/assets loaded and represented in the `UnifiedViewModel`?
    * **LLM Processing:**
        * Trace the data flow from the LLM response -> `LLMPredictionHandler` -> `UnifiedViewModel`.
        * How does the `UnifiedViewModel` reconcile the LLM's version of the file list with any existing state? Is there a clear "source of truth" for the file list before and after LLM processing?
    * **Preset-Based Processing:**
        * Trace data flow from preset selection -> `RuleBasedPredictionHandler` -> `UnifiedViewModel`.
        * How does this "systematic approach" correct discrepancies? Does it fully replace the model's data or merge it?
    * **Renaming/Restructuring:**
        * Trace the events and actions from a user renaming an asset -> `AssetRestructureHandler` -> `UnifiedViewModel`.
        * How are `AssetRule` items checked for emptiness and subsequently removed (or not removed)?

3. **Analyze Event Handling and Signal/Slot Connections:**
    * Map out the key signals and slots between the `UnifiedViewModel`, `AssetRestructureHandler`, prediction handlers, and the main UI components.
    * Ensure that signals are emitted and slots are connected correctly to trigger necessary updates and prevent race conditions or missed updates.

**Phase 2: Identify Issues & Propose Refactoring Strategies**

Based on the review, we will pinpoint specific areas contributing to the reported problems and suggest improvements.

1. **For File List Discrepancies (especially post-LLM):**
    * **Potential Issue:** The `UnifiedViewModel` might be directly replacing its internal data with the LLM's output without proper validation or merging against the original input file list.
    * **Proposed Strategy:**
        * Establish a clear "source of truth" for the actual input files that remains independent of the LLM's interpretation.
        * When the LLM provides its categorized list, the `LLMPredictionHandler` or `UnifiedViewModel` should *map* the LLM's findings onto the *existing* source files rather than creating a new list from scratch based on LLM hallucinations.
        * If the LLM identifies files not in the original input, these should be flagged or handled as discrepancies, not added as if they were real.
        * If the LLM *misses* files from the original input, these should remain visible, perhaps marked as "uncategorized by LLM."
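The mapping strategy above can be sketched as follows. This is an illustrative sketch only, not the project's actual code: `reconcile_llm_output` and `FileEntry` are hypothetical names, and the real handlers work with `SourceRule` trees rather than flat lists.

```python
from dataclasses import dataclass, field

@dataclass
class FileEntry:
    # Hypothetical stand-in for a FileRule-like record.
    path: str
    file_type: str = "UNCLASSIFIED"
    flags: list = field(default_factory=list)

def reconcile_llm_output(source_paths, llm_analysis):
    """Map LLM classifications onto the authoritative input file list.

    source_paths: the source of truth (actual input files).
    llm_analysis: LLM output items with 'relative_file_path' and
                  'classified_file_type'.
    Returns (entries, hallucinated_paths).
    """
    known = {p: FileEntry(p) for p in source_paths}
    hallucinated = []
    for item in llm_analysis:
        path = item["relative_file_path"]
        if path in known:
            known[path].file_type = item["classified_file_type"]
        else:
            # The LLM invented a file: record the discrepancy, never add it.
            hallucinated.append(path)
    for entry in known.values():
        if entry.file_type == "UNCLASSIFIED":
            # The LLM missed this input file: keep it visible, flagged.
            entry.flags.append("uncategorized by LLM")
    return list(known.values()), hallucinated
```

The key design point is that the loop only ever mutates entries built from `source_paths`, so the displayed file list can never grow or shrink based on LLM output alone.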
2. **For Persistent Empty Asset Rows:**
    * **Potential Issue:** The `AssetRestructureHandler`'s logic for detecting and removing empty `AssetRule` items might be flawed or not consistently triggered. It might not correctly count child `FileRule` items after a move, or the signal to check for emptiness might not always fire.
    * **Proposed Strategy:**
        * Review and strengthen the logic within `AssetRestructureHandler` that checks if an `AssetRule` is empty after its `FileRule` children are moved or its name changes.
        * Ensure that this check is reliably performed *after* all relevant model updates have completed.
        * Consider adding explicit methods to `UnifiedViewModel` or `AssetRule` to query if an asset group is truly empty (has no associated `FileRule` items).
        * Ensure that the `UnifiedViewModel` correctly emits signals that the `AssetRestructureHandler` can use to trigger cleanup of empty asset rows.
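A minimal sketch of the explicit emptiness query and post-batch cleanup proposed above. The `AssetRule`/`SourceRule` classes here are simplified stand-ins, not the project's real classes; the point is the ordering: prune only after a batch of moves has fully completed.

```python
class AssetRule:
    # Simplified stand-in for the real AssetRule.
    def __init__(self, name):
        self.name = name
        self.file_rules = []

    def is_empty(self):
        # Explicit emptiness query, as proposed above.
        return len(self.file_rules) == 0

class SourceRule:
    # Simplified stand-in for the real SourceRule.
    def __init__(self):
        self.asset_rules = []

    def prune_empty_assets(self):
        """Remove empty AssetRule items and return them.

        Call this once, after all FileRule moves in a batch have
        completed; pruning mid-move is what leaves stale empty rows.
        """
        removed = [a for a in self.asset_rules if a.is_empty()]
        self.asset_rules = [a for a in self.asset_rules if not a.is_empty()]
        return removed
```

In the real handler, the equivalent of `prune_empty_assets` would run in a slot connected to a "restructure finished" signal rather than inside the per-file move logic.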
3. **General Robustness & Maintainability:**
    * **State Management:** Clarify state management within `UnifiedViewModel`. Ensure data consistency and minimize side effects.
    * **Modularity:** Ensure clear separation of concerns between the `UnifiedViewModel` (data representation), prediction handlers (data generation), and `AssetRestructureHandler` (data manipulation).
    * **Logging & Error Handling:** Improve logging in these critical sections to make troubleshooting easier. Add robust error handling for unexpected data states.
    * **Unit Tests:** Identify areas where unit tests could be added or improved to cover the scenarios causing these bugs, especially around model updates and restructuring.
**Phase 3: Documentation & Handoff**

1. Document the findings of the code review.
2. Detail the agreed-upon refactoring plan.
3. Prepare for handoff to a developer (e.g., by switching to "Code" mode).

**Visual Plan (Mermaid Diagram):**

```mermaid
graph TD
    subgraph GUI Interaction
        UserAction["User Action (Load Files, Rename Asset, Trigger LLM)"]
    end

    subgraph Prediction Layer
        LLM_Handler(["gui.llm_prediction_handler.LLMPredictionHandler"])
        Preset_Handler(["gui.prediction_handler.RuleBasedPredictionHandler"])
    end

    subgraph Core GUI Logic
        MainWindow(["gui.main_window.MainWindow"])
        MainPanel(["gui.main_panel_widget.MainPanelWidget"])
        ViewModel(["gui.unified_view_model.UnifiedViewModel"])
        RestructureHandler(["gui.asset_restructure_handler.AssetRestructureHandler"])
    end

    subgraph Data Structures
        RuleStruct(["rule_structure.py <br> SourceRule, AssetRule, FileRule"])
    end

    UserAction --> MainWindow
    MainWindow --> MainPanel
    MainPanel -- Triggers Predictions --> LLM_Handler
    MainPanel -- Triggers Predictions --> Preset_Handler
    MainPanel -- Displays Data From --> ViewModel

    LLM_Handler -- Provides SourceRule Data --> ViewModel
    Preset_Handler -- Provides SourceRule Data --> ViewModel

    ViewModel -- Manages --> RuleStruct
    ViewModel -- Signals Changes --> RestructureHandler
    ViewModel -- Signals Changes --> MainPanel

    RestructureHandler -- Modifies --> ViewModel

    %% Highlight the components implicated in the two issues
    style LLM_Handler fill:#f9d,stroke:#333,stroke-width:2px
    style ViewModel fill:#f9d,stroke:#333,stroke-width:2px
    style RestructureHandler fill:#f9d,stroke:#333,stroke-width:2px

    %% LLM_Handler: potential source of file list discrepancies
    %% RestructureHandler: potential source of persistent empty asset rows
```
@@ -1,105 +0,0 @@
# Bit Depth Terminology Refactoring Plan

## 1. Background

Currently, there's an inconsistency in how bit depth rules and settings are defined and used across the project:

* **`config/file_type_definitions.json`**: Uses `"bit_depth_rule"` with values like `"force_8bit"` and `"respect"`.
* **`config/app_settings.json`**: (Within `MAP_MERGE_RULES`) uses `"output_bit_depth"` with values like `"respect_inputs"`.
* **`processing/utils/image_saving_utils.py`**: Contains logic that attempts to handle `"respect_inputs"` but is currently unreachable, and the `"respect"` rule effectively defaults to 8-bit.

This plan aims to unify the terminology and correct the processing logic.

## 2. Proposed Unified Terminology

A new configuration key and a clear set of values will be adopted:

* **New Key**: `bit_depth_policy`
    * This key will replace `"bit_depth_rule"` in `file_type_definitions.json`.
    * This key will replace `"output_bit_depth"` in `app_settings.json` (for `MAP_MERGE_RULES`).
* **Values for `bit_depth_policy`**:
    * `"force_8bit"`: Always output 8-bit.
    * `"force_16bit"`: Always output 16-bit.
    * `"preserve"`: If any source image (or any input to a merge operation) has a bit depth greater than 8-bit, the output will be 16-bit. Otherwise, the output will be 8-bit.
    * `""` (empty string or `null`): No specific bit depth policy applies (e.g., for non-image files like models or text files).

## 3. Refactoring Plan Details

### Phase 1: Configuration File Updates

1. **`config/file_type_definitions.json`**:
    * Rename all instances of the key `"bit_depth_rule"` to `"bit_depth_policy"`.
    * Update values:
        * `"force_8bit"` remains `"force_8bit"`.
        * `"respect"` changes to `"preserve"`.
        * `""` (empty string) remains `""`.
2. **`config/app_settings.json`**:
    * Within each rule in the `MAP_MERGE_RULES` array, rename the key `"output_bit_depth"` to `"bit_depth_policy"`.
    * Update the value: `"respect_inputs"` changes to `"preserve"`.

### Phase 2: Code Update - `configuration.py`

1. Modify the `Configuration` class:
    * Rename the method `get_bit_depth_rule()` to `get_bit_depth_policy()`.
    * Update this method to read the new `"bit_depth_policy"` key from the loaded file type definitions.
    * Ensure it correctly handles and returns the new policy values (`"force_8bit"`, `"force_16bit"`, `"preserve"`, `""`).
    * The method should continue to provide a sensible default if a map type is not found or has an invalid policy.
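A minimal sketch of the accessor described above. The real `Configuration` class's loading and storage details are assumed; only the policy validation and default behavior from the bullets are shown, and the `"preserve"` default is one plausible choice, not mandated by the plan.

```python
VALID_POLICIES = {"force_8bit", "force_16bit", "preserve", ""}

class Configuration:
    # Sketch only: the real class loads JSON from disk; here the parsed
    # FILE_TYPE_DEFINITIONS mapping is injected directly.
    def __init__(self, file_type_defs):
        self._defs = file_type_defs

    def get_bit_depth_policy(self, map_type, default="preserve"):
        policy = self._defs.get(map_type, {}).get("bit_depth_policy", default)
        if policy not in VALID_POLICIES:
            # Invalid value in a user-edited config: fall back safely.
            return default
        return policy
```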
### Phase 3: Code Update - `processing/utils/image_saving_utils.py`

1. Refactor the `save_image_variants` function:
    * It will receive the `bit_depth_policy` (e.g., `"preserve"`, `"force_8bit"`) via its `file_type_defs` argument (which originates from the `Configuration` object).
    * Correct the internal logic for determining `target_bit_depth` based on the `bit_depth_policy` argument:
        * If `bit_depth_policy == "force_8bit"`, then `target_bit_depth = 8`.
        * If `bit_depth_policy == "force_16bit"`, then `target_bit_depth = 16`.
        * If `bit_depth_policy == "preserve"`:
            * Examine the `source_bit_depth_info` argument (list of bit depths of input images).
            * If any source bit depth in `source_bit_depth_info` is greater than 8, then `target_bit_depth = 16`.
            * Otherwise (all source bit depths are 8 or less, or the list is empty/all None), `target_bit_depth = 8`.
        * If `bit_depth_policy == ""` or is `null` (or any other unhandled value), a clear default behavior should be established (e.g., log a warning and default to `"preserve"` or skip bit depth adjustments if appropriate for the file type).
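The branching above can be transcribed directly into a small helper. This is a sketch of the intended logic, not the actual `save_image_variants` code; the function name `resolve_target_bit_depth` and the 8-bit fallback for unknown policies are assumptions.

```python
import logging

logger = logging.getLogger(__name__)

def resolve_target_bit_depth(bit_depth_policy, source_bit_depth_info):
    """Decide the output bit depth per the rules above.

    source_bit_depth_info: list of input bit depths; entries may be None
    when a source's depth could not be determined.
    """
    if bit_depth_policy == "force_8bit":
        return 8
    if bit_depth_policy == "force_16bit":
        return 16
    if bit_depth_policy == "preserve":
        depths = [d for d in (source_bit_depth_info or []) if d is not None]
        # Any >8-bit input promotes the output to 16-bit.
        return 16 if any(d > 8 for d in depths) else 8
    # "" / null / unknown value: warn and fall back (8-bit assumed here).
    logger.warning("Unhandled bit_depth_policy %r; defaulting to 8-bit",
                   bit_depth_policy)
    return 8
```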
### Phase 4: Code Update - `processing/pipeline/stages/merged_task_processor.py`

1. This stage is largely unaffected in its core logic for collecting `input_source_bit_depths`.
2. The `ProcessedMergedMapData` object it produces will continue to carry these `source_bit_depths`.
3. When this data is later passed to the `SaveVariantsStage` (and subsequently to `save_image_variants`), the `internal_map_type` of the merged map (e.g., "MAP_NRMRGH") will be used. The `Configuration` object will provide its `bit_depth_policy` (which, after refactoring `file_type_definitions.json`, should be `"preserve"` for relevant merged maps).
4. The refactored `save_image_variants` will then use this `"preserve"` policy along with the `source_bit_depth_info` (derived from the merge inputs) to correctly determine the output bit depth for the merged map.

### Phase 5: Review Other Code & Potential Impacts

1. Conduct a codebase search for any remaining direct usages of the old keys (`"bit_depth_rule"`, `"output_bit_depth"`) or their values.
2. Update these locations to use the new `Configuration.get_bit_depth_policy()` method and the new `"bit_depth_policy"` key and values.
3. Pay special attention to any prediction logic (e.g., in `gui/prediction_handler.py` or `gui/llm_prediction_handler.py`) if it currently considers or tries to infer bit depth rules.

## 4. Backward Compatibility & Migration

* This is a breaking change for existing user-customized configuration files (`file_type_definitions.json`, `app_settings.json`, and any custom presets).
* **Recommended Approach**: Implement migration logic within the `Configuration` class's loading methods.
    * When loading `file_type_definitions.json`: If `"bit_depth_rule"` is found, convert its value (e.g., `"respect"` to `"preserve"`) and store it under the new `"bit_depth_policy"` key. Log a warning.
    * When loading `app_settings.json` (specifically `MAP_MERGE_RULES`): If `"output_bit_depth"` is found, convert its value (e.g., `"respect_inputs"` to `"preserve"`) and store it under `"bit_depth_policy"`. Log a warning.
    * This ensures the application can still function with older user configs while guiding users to update.
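The migration described above can be sketched as a single helper applied to each loaded definition or merge rule. This is a sketch under the plan's assumptions; `migrate_bit_depth_keys` is a hypothetical name, and only the two value conversions the plan names are mapped.

```python
import logging

logger = logging.getLogger(__name__)

# Old-value -> new-value conversions named in the plan; anything else
# (e.g. "force_8bit") passes through unchanged.
_VALUE_MIGRATION = {"respect": "preserve", "respect_inputs": "preserve"}

def migrate_bit_depth_keys(entry, old_key):
    """Migrate one config dict in place.

    old_key is "bit_depth_rule" (file_type_definitions.json) or
    "output_bit_depth" (MAP_MERGE_RULES in app_settings.json).
    """
    if old_key in entry and "bit_depth_policy" not in entry:
        old_value = entry.pop(old_key)
        entry["bit_depth_policy"] = _VALUE_MIGRATION.get(old_value, old_value)
        logger.warning(
            "Migrated legacy %r=%r to bit_depth_policy=%r; "
            "please update your config.",
            old_key, old_value, entry["bit_depth_policy"])
    return entry
```

Running this during loading keeps older user configs working while the warning nudges users toward the new key.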
## 5. Visualized Logic for `save_image_variants` (Post-Refactor)

```mermaid
graph TD
    A[Start save_image_variants] --> B{"Get bit_depth_policy for base_map_type"};
    B --> C{"Policy is 'force_8bit'?"};
    C -- Yes --> D[target_bit_depth = 8];
    C -- No --> E{"Policy is 'force_16bit'?"};
    E -- Yes --> F[target_bit_depth = 16];
    E -- No --> G{"Policy is 'preserve'?"};
    G -- Yes --> H{"Any source_bit_depth_info > 8?"};
    H -- Yes --> I[target_bit_depth = 16];
    H -- No --> J[target_bit_depth = 8];
    G -- No --> K["Log warning: unknown or empty policy, default to 8-bit or handle as per type"];
    K --> D;
    D --> L[Proceed to save with 8-bit];
    F --> M[Proceed to save with 16-bit];
    I --> M;
    J --> L;
    L --> Z[End];
    M --> Z;
```

This plan aims to create a more consistent, understandable, and correctly functioning system for handling bit depth across the application.
@@ -1,6 +1,6 @@
 # Asset Processing Utility

-This tool streamlines the organisation of raw 3D asset source files from supplies (archives or folders) into a configurable library format.
+This tool streamlines the conversion of raw 3D asset source files from supplies (archives or folders) into a configurable library format.
 Goals include automatically updating Assets in various DCC's on import as well - minimising end user workload.

 ## Features
@@ -1,280 +0,0 @@
{
  "preset_name": "Dinesen",
  "supplier_name": "Dinesen",
  "notes": "Preset for standard Poliigon downloads. Prioritizes _xxx16 files. Moves previews etc. to Extra/. Assumes Metal/Rough workflow.",
  "source_naming": {
    "separator": "_",
    "part_indices": { "base_name": 0, "map_type": 1 },
    "glossiness_keywords": ["GLOSS"]
  },
  "move_to_extra_patterns": ["*_Preview*", "*_Sphere*", "*_Cube*", "*_Flat*", "*.txt", "*.pdf", "*.url", "*.htm*", "*_Fabric.*", "*_DISP_*METALNESS*"],
  "map_type_mapping": [
    { "target_type": "MAP_COL", "keywords": ["COLOR*", "COL", "COL-*", "DIFFUSE", "DIF", "ALBEDO"] },
    { "target_type": "MAP_NRM", "keywords": ["NORMAL*", "NORM*", "NRM*", "N"], "priority_keywords": ["*_NRM16*", "*_NM16*", "*Normal16*"] },
    { "target_type": "MAP_ROUGH", "keywords": ["ROUGHNESS", "ROUGH"] },
    { "target_type": "MAP_GLOSS", "keywords": ["GLOSS"] },
    { "target_type": "MAP_AO", "keywords": ["AMBIENTOCCLUSION", "AO"] },
    { "target_type": "MAP_DISP", "keywords": ["DISPLACEMENT", "DISP", "HEIGHT", "BUMP"], "priority_keywords": ["*_DISP16*", "*_DSP16*", "*DSP16*", "*DISP16*", "*Displacement16*", "*Height16*"] },
    { "target_type": "MAP_REFL", "keywords": ["REFLECTION", "REFL", "SPECULAR", "SPEC"] },
    { "target_type": "MAP_SSS", "keywords": ["SSS", "SUBSURFACE*"] },
    { "target_type": "MAP_FUZZ", "keywords": ["FUZZ"] },
    { "target_type": "MAP_IDMAP", "keywords": ["IDMAP"] },
    { "target_type": "MAP_MASK", "keywords": ["OPAC*", "TRANSP*", "MASK*", "ALPHA*"] },
    { "target_type": "MAP_METAL", "keywords": ["METAL*", "METALLIC"] }
  ],
  "asset_category_rules": {
    "model_patterns": ["*.fbx", "*.obj", "*.blend", "*.mtl"],
    "decal_keywords": ["Decal"]
  },
  "archetype_rules": [
    ["Foliage", { "match_any": ["Plant", "Leaf", "Leaves", "Grass"], "match_all": [] }],
    ["Fabric", { "match_any": ["Fabric", "Carpet", "Cloth", "Textile", "Leather"], "match_all": [] }],
    ["Wood", { "match_any": ["Wood", "Timber", "Plank", "Board"], "match_all": [] }],
    ["Metal", { "match_any": ["_Metal", "Steel", "Iron", "Gold", "Copper", "Chrome", "Aluminum", "Brass", "Bronze"], "match_all": [] }],
    ["Concrete", { "match_any": ["Concrete", "Cement"], "match_all": [] }],
    ["Ground", { "match_any": ["Ground", "Dirt", "Soil", "Mud", "Sand", "Gravel", "Asphalt", "Road", "Moss"], "match_all": [] }],
    ["Stone", { "match_any": ["Stone", "Rock*", "Marble", "Granite", "Brick", "Tile", "Paving", "Pebble*", "Terrazzo", "Slate"], "match_all": [] }],
    ["Plaster", { "match_any": ["Plaster", "Stucco", "Wall", "Paint"], "match_all": [] }],
    ["Plastic", { "match_any": ["Plastic", "PVC", "Resin", "Rubber"], "match_all": [] }],
    ["Glass", { "match_any": ["Glass"], "match_all": [] }]
  ]
}
@@ -1,44 +0,0 @@
{
  "ASSET_TYPE_DEFINITIONS": {
    "Surface": {
      "color": "#1f3e5d",
      "description": "A single Standard PBR material set for a surface.",
      "examples": [
        "Set: Wood01_COL + Wood01_NRM + WOOD01_ROUGH",
        "Set: Dif_Concrete + Normal_Concrete + Refl_Concrete"
      ]
    },
    "Model": {
      "color": "#b67300",
      "description": "A set that contains models; can include a PBR texture set.",
      "examples": [
        "Single = Chair.fbx",
        "Set = Plant02.fbx + Plant02_col + Plant02_SSS"
      ]
    },
    "Decal": {
      "color": "#68ac68",
      "description": "An alpha-masked texture set.",
      "examples": [
        "Set = DecalGraffiti01_Col + DecalGraffiti01_Alpha",
        "Single = DecalLeakStain03"
      ]
    },
    "Atlas": {
      "color": "#955b8b",
      "description": "A texture whose name usually hints that it's an atlas.",
      "examples": [
        "Set = FoliageAtlas01_col + FoliageAtlas01_nrm"
      ]
    },
    "UtilityMap": {
      "color": "#706b87",
      "description": "A useful image asset consisting of only a single texture. Therefore each UtilityMap can only contain a single item.",
      "examples": [
        "Single = imperfection.png",
        "Single = smudges.png",
        "Single = scratches.tif"
      ]
    }
  }
}
@@ -1,219 +0,0 @@
{
  "FILE_TYPE_DEFINITIONS": {
    "MAP_COL": {
      "bit_depth_policy": "force_8bit",
      "color": "#ffaa00",
      "description": "Color/Albedo Map",
      "examples": ["_col.", "_basecolor.", "albedo", "diffuse"],
      "is_grayscale": false,
      "keybind": "C",
      "standard_type": "COL"
    },
    "MAP_NRM": {
      "bit_depth_policy": "preserve",
      "color": "#cca2f1",
      "description": "Normal Map",
      "examples": ["_nrm.", "_normal."],
      "is_grayscale": false,
      "keybind": "N",
      "standard_type": "NRM"
    },
    "MAP_NRMRGH": {
      "bit_depth_policy": "preserve",
      "color": "#abcdef",
      "description": "Normal + Roughness Merged Map",
      "examples": [],
      "is_grayscale": false,
      "keybind": "",
      "standard_type": "NRMRGH"
    },
    "MAP_METAL": {
      "bit_depth_policy": "force_8bit",
      "color": "#dcf4f2",
      "description": "Metalness Map",
      "examples": ["_metal.", "_met."],
      "is_grayscale": true,
      "keybind": "M",
      "standard_type": "METAL"
    },
    "MAP_ROUGH": {
      "bit_depth_policy": "force_8bit",
      "color": "#bfd6bf",
      "description": "Roughness Map",
      "examples": ["_rough.", "_rgh.", "_gloss"],
      "is_grayscale": true,
      "keybind": "R",
      "standard_type": "ROUGH"
    },
    "MAP_GLOSS": {
      "bit_depth_policy": "force_8bit",
      "color": "#d6bfd6",
      "description": "Glossiness Map",
      "examples": ["_gloss.", "_gls."],
      "is_grayscale": true,
      "keybind": "R",
      "standard_type": "GLOSS"
    },
    "MAP_AO": {
      "bit_depth_policy": "force_8bit",
      "color": "#e3c7c7",
      "description": "Ambient Occlusion Map",
      "examples": ["_ao.", "_ambientocclusion."],
      "is_grayscale": true,
      "keybind": "",
      "standard_type": "AO"
    },
    "MAP_DISP": {
      "bit_depth_policy": "preserve",
      "color": "#c6ddd5",
      "description": "Displacement/Height Map",
      "examples": ["_disp.", "_height."],
      "is_grayscale": true,
      "keybind": "D",
      "standard_type": "DISP"
    },
    "MAP_REFL": {
      "bit_depth_policy": "force_8bit",
      "color": "#c2c2b9",
      "description": "Reflection/Specular Map",
      "examples": ["_refl.", "_specular."],
      "is_grayscale": true,
      "keybind": "M",
      "standard_type": "REFL"
    },
    "MAP_SSS": {
      "bit_depth_policy": "preserve",
      "color": "#a0d394",
      "description": "Subsurface Scattering Map",
      "examples": ["_sss.", "_subsurface."],
      "is_grayscale": true,
      "keybind": "",
      "standard_type": "SSS"
    },
    "MAP_FUZZ": {
      "bit_depth_policy": "force_8bit",
      "color": "#a2d1da",
      "description": "Fuzz/Sheen Map",
      "examples": ["_fuzz.", "_sheen."],
      "is_grayscale": true,
      "keybind": "",
      "standard_type": "FUZZ"
    },
    "MAP_IDMAP": {
      "bit_depth_policy": "force_8bit",
      "color": "#ca8fb4",
      "description": "ID Map (for masking)",
      "examples": ["_id.", "_matid."],
      "is_grayscale": false,
      "keybind": "",
      "standard_type": "IDMAP"
    },
    "MAP_MASK": {
      "bit_depth_policy": "force_8bit",
      "color": "#c6e2bf",
      "description": "Generic Mask Map",
      "examples": ["_mask."],
      "is_grayscale": true,
      "keybind": "",
      "standard_type": "MASK"
    },
    "MAP_IMPERFECTION": {
      "bit_depth_policy": "force_8bit",
      "color": "#e6d1a6",
      "description": "Imperfection Map (scratches, dust)",
      "examples": ["_imp.", "_imperfection.", "splatter", "scratches", "smudges", "hairs", "fingerprints"],
      "is_grayscale": true,
      "keybind": "",
      "standard_type": "IMPERFECTION"
    },
    "MODEL": {
      "bit_depth_policy": "",
      "color": "#3db2bd",
      "description": "3D Model File",
      "examples": [".fbx", ".obj"],
      "is_grayscale": false,
      "keybind": "",
      "standard_type": ""
    },
    "EXTRA": {
      "bit_depth_policy": "",
      "color": "#8c8c8c",
      "description": "asset previews or metadata",
      "examples": [".txt", ".zip", "preview.", "_flat.", "_sphere.", "_Cube.", "thumb"],
      "is_grayscale": false,
      "keybind": "E",
      "standard_type": "EXTRA"
    },
    "FILE_IGNORE": {
      "bit_depth_policy": "",
      "color": "#673d35",
      "description": "File identified to be ignored due to prioritization rules (e.g., a lower bit-depth version when a higher one is present).",
      "category": "Ignored",
      "examples": ["Thumbs.db", ".DS_Store"],
      "is_grayscale": false,
      "keybind": "X",
      "standard_type": "",
      "details": {}
    }
  }
}
@@ -1,267 +0,0 @@
{
  "llm_predictor_examples": [
    {
      "input": "MessyTextures/Concrete_Damage_Set/concrete_col.png\nMessyTextures/Concrete_Damage_Set/concrete_N.png\nMessyTextures/Concrete_Damage_Set/concrete_rough.jpg\nMessyTextures/Concrete_Damage_Set/height_map_concrete.tif\nMessyTextures/Concrete_Damage_Set/Thumbs.db\nMessyTextures/Fabric_Pattern/pattern_01_diffuse.tga\nMessyTextures/Fabric_Pattern/pattern_01_ao.png\nMessyTextures/Fabric_Pattern/pattern_01_normal.png\nMessyTextures/Fabric_Pattern/notes.txt\nMessyTextures/Fabric_Pattern/variant_blue_diffuse.tga\nMessyTextures/Fabric_Pattern/fabric_flat.jpg",
      "output": {
        "individual_file_analysis": [
          { "relative_file_path": "MessyTextures/Concrete_Damage_Set/concrete_col.png", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "Concrete_Damage_Set" },
          { "relative_file_path": "MessyTextures/Concrete_Damage_Set/concrete_N.png", "classified_file_type": "MAP_NRM", "proposed_asset_group_name": "Concrete_Damage_Set" },
          { "relative_file_path": "MessyTextures/Concrete_Damage_Set/concrete_rough.jpg", "classified_file_type": "MAP_ROUGH", "proposed_asset_group_name": "Concrete_Damage_Set" },
          { "relative_file_path": "MessyTextures/Concrete_Damage_Set/height_map_concrete.tif", "classified_file_type": "MAP_DISP", "proposed_asset_group_name": "Concrete_Damage_Set" },
          { "relative_file_path": "MessyTextures/Concrete_Damage_Set/Thumbs.db", "classified_file_type": "FILE_IGNORE", "proposed_asset_group_name": null },
          { "relative_file_path": "MessyTextures/Fabric_Pattern/pattern_01_diffuse.tga", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "Fabric_Pattern_01" },
          { "relative_file_path": "MessyTextures/Fabric_Pattern/pattern_01_ao.png", "classified_file_type": "MAP_AO", "proposed_asset_group_name": "Fabric_Pattern_01" },
          { "relative_file_path": "MessyTextures/Fabric_Pattern/pattern_01_normal.png", "classified_file_type": "MAP_NRM", "proposed_asset_group_name": "Fabric_Pattern_01" },
          { "relative_file_path": "MessyTextures/Fabric_Pattern/notes.txt", "classified_file_type": "EXTRA", "proposed_asset_group_name": "Fabric_Pattern_01" },
          { "relative_file_path": "MessyTextures/Fabric_Pattern/variant_blue_diffuse.tga", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "Fabric_Pattern_01" },
          { "relative_file_path": "MessyTextures/Fabric_Pattern/fabric_flat.jpg", "classified_file_type": "EXTRA", "proposed_asset_group_name": "Fabric_Pattern_01" }
        ],
        "asset_group_classifications": {
          "Concrete_Damage_Set": "Surface",
          "Fabric_Pattern_01": "Surface"
        }
      }
    },
    {
      "input": "SciFi_Drone/Drone_Model.fbx\nSciFi_Drone/Textures/Drone_BaseColor.png\nSciFi_Drone/Textures/Drone_Metallic.png\nSciFi_Drone/Textures/Drone_Roughness.png\nSciFi_Drone/Textures/Drone_Normal.png\nSciFi_Drone/Textures/Drone_Emissive.jpg\nSciFi_Drone/ReferenceImages/concept.jpg",
      "output": {
        "individual_file_analysis": [
          { "relative_file_path": "SciFi_Drone/Drone_Model.fbx", "classified_file_type": "MODEL", "proposed_asset_group_name": "SciFi_Drone" },
          { "relative_file_path": "SciFi_Drone/Textures/Drone_BaseColor.png", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "SciFi_Drone" },
          { "relative_file_path": "SciFi_Drone/Textures/Drone_Metallic.png", "classified_file_type": "MAP_METAL", "proposed_asset_group_name": "SciFi_Drone" },
          { "relative_file_path": "SciFi_Drone/Textures/Drone_Roughness.png", "classified_file_type": "MAP_ROUGH", "proposed_asset_group_name": "SciFi_Drone" },
          { "relative_file_path": "SciFi_Drone/Textures/Drone_Normal.png", "classified_file_type": "MAP_NRM", "proposed_asset_group_name": "SciFi_Drone" },
          { "relative_file_path": "SciFi_Drone/Textures/Drone_Emissive.jpg", "classified_file_type": "EXTRA", "proposed_asset_group_name": "SciFi_Drone" },
          { "relative_file_path": "SciFi_Drone/ReferenceImages/concept.jpg", "classified_file_type": "EXTRA", "proposed_asset_group_name": "SciFi_Drone" }
        ],
        "asset_group_classifications": {
          "SciFi_Drone": "Model"
        }
      }
    },
{
|
||||
"input": "21_hairs_deposits.tif\n22_hairs_fabric.tif\n23_hairs_fibres.tif\n24_hairs_fibres.tif\n25_bonus_isolatedFingerprints.tif\n26_bonus_isolatedPalmprint.tif\n27_metal_aluminum.tif\n28_metal_castIron.tif\n29_scratcehes_deposits_shapes.tif\n30_scratches_deposits.tif",
|
||||
"output": {
|
||||
"individual_file_analysis": [
|
||||
{
|
||||
"relative_file_path": "21_hairs_deposits.tif",
|
||||
"classified_file_type": "MAP_IMPERFECTION",
|
||||
"proposed_asset_group_name": "Hairs_Deposits_21"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "22_hairs_fabric.tif",
|
||||
"classified_file_type": "MAP_IMPERFECTION",
|
||||
"proposed_asset_group_name": "Hairs_Fabric_22"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "23_hairs_fibres.tif",
|
||||
"classified_file_type": "MAP_IMPERFECTION",
|
||||
"proposed_asset_group_name": "Hairs_Fibres_23"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "24_hairs_fibres.tif",
|
||||
"classified_file_type": "MAP_IMPERFECTION",
|
||||
"proposed_asset_group_name": "Hairs_Fibres_24"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "25_bonus_isolatedFingerprints.tif",
|
||||
"classified_file_type": "MAP_IMPERFECTION",
|
||||
"proposed_asset_group_name": "Bonus_IsolatedFingerprints_25"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "26_bonus_isolatedPalmprint.tif",
|
||||
"classified_file_type": "MAP_IMPERFECTION",
|
||||
"proposed_asset_group_name": "Bonus_IsolatedPalmprint_26"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "27_metal_aluminum.tif",
|
||||
"classified_file_type": "MAP_IMPERFECTION",
|
||||
"proposed_asset_group_name": "Metal_Aluminum_27"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "28_metal_castIron.tif",
|
||||
"classified_file_type": "MAP_IMPERFECTION",
|
||||
"proposed_asset_group_name": "Metal_CastIron_28"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "29_scratcehes_deposits_shapes.tif",
|
||||
"classified_file_type": "MAP_IMPERFECTION",
|
||||
"proposed_asset_group_name": "Scratches_Deposits_Shapes_29"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "30_scratches_deposits.tif",
|
||||
"classified_file_type": "MAP_IMPERFECTION",
|
||||
"proposed_asset_group_name": "Scratches_Deposits_30"
|
||||
}
|
||||
],
|
||||
"asset_group_classifications": {
|
||||
"Hairs_Deposits_21": "UtilityMap",
|
||||
"Hairs_Fabric_22": "UtilityMap",
|
||||
"Hairs_Fibres_23": "UtilityMap",
|
||||
"Hairs_Fibres_24": "UtilityMap",
|
||||
"Bonus_IsolatedFingerprints_25": "UtilityMap",
|
||||
"Bonus_IsolatedPalmprint_26": "UtilityMap",
|
||||
"Metal_Aluminum_27": "UtilityMap",
|
||||
"Metal_CastIron_28": "UtilityMap",
|
||||
"Scratches_Deposits_Shapes_29": "UtilityMap",
|
||||
"Scratches_Deposits_30": "UtilityMap"
|
||||
}
|
||||
}
|
||||
},
|
||||
{
|
||||
"input": "Part1/TextureSupply_Boards001_A_28x300cm-Albedo.jpg\nPart1/TextureSupply_Boards001_A_28x300cm-Normal.jpg\nPart1/TextureSupply_Boards001_B_28x300cm-Albedo.jpg\nPart1/TextureSupply_Boards001_B_28x300cm-Normal.jpg\nPart1/TextureSupply_Boards001_C_28x300cm-Albedo.jpg\nPart1/TextureSupply_Boards001_C_28x300cm-Normal.jpg\nPart1/TextureSupply_Boards001_D_28x300cm-Albedo.jpg\nPart1/TextureSupply_Boards001_D_28x300cm-Normal.jpg\nPart1/TextureSupply_Boards001_E_28x300cm-Albedo.jpg\nPart1/TextureSupply_Boards001_E_28x300cm-Normal.jpg\nPart1/TextureSupply_Boards001_F_28x300cm-Albedo.jpg\nPart1/TextureSupply_Boards001_F_28x300cm-Normal.jpg",
|
||||
"output": {
|
||||
"individual_file_analysis": [
|
||||
{
|
||||
"relative_file_path": "Part1/TextureSupply_Boards001_A_28x300cm-Albedo.jpg",
|
||||
"classified_file_type": "MAP_COL",
|
||||
"proposed_asset_group_name": "Boards001_A"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "Part1/TextureSupply_Boards001_A_28x300cm-Normal.jpg",
|
||||
"classified_file_type": "MAP_NRM",
|
||||
"proposed_asset_group_name": "Boards001_A"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "Part1/TextureSupply_Boards001_B_28x300cm-Albedo.jpg",
|
||||
"classified_file_type": "MAP_COL",
|
||||
"proposed_asset_group_name": "Boards001_B"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "Part1/TextureSupply_Boards001_B_28x300cm-Normal.jpg",
|
||||
"classified_file_type": "MAP_NRM",
|
||||
"proposed_asset_group_name": "Boards001_B"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "Part1/TextureSupply_Boards001_C_28x300cm-Albedo.jpg",
|
||||
"classified_file_type": "MAP_COL",
|
||||
"proposed_asset_group_name": "Boards001_C"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "Part1/TextureSupply_Boards001_C_28x300cm-Normal.jpg",
|
||||
"classified_file_type": "MAP_NRM",
|
||||
"proposed_asset_group_name": "Boards001_C"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "Part1/TextureSupply_Boards001_D_28x300cm-Albedo.jpg",
|
||||
"classified_file_type": "MAP_COL",
|
||||
"proposed_asset_group_name": "Boards001_D"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "Part1/TextureSupply_Boards001_D_28x300cm-Normal.jpg",
|
||||
"classified_file_type": "MAP_NRM",
|
||||
"proposed_asset_group_name": "Boards001_D"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "Part1/TextureSupply_Boards001_E_28x300cm-Albedo.jpg",
|
||||
"classified_file_type": "MAP_COL",
|
||||
"proposed_asset_group_name": "Boards001_E"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "Part1/TextureSupply_Boards001_E_28x300cm-Normal.jpg",
|
||||
"classified_file_type": "MAP_NRM",
|
||||
"proposed_asset_group_name": "Boards001_E"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "Part1/TextureSupply_Boards001_F_28x300cm-Albedo.jpg",
|
||||
"classified_file_type": "MAP_COL",
|
||||
"proposed_asset_group_name": "Boards001_F"
|
||||
},
|
||||
{
|
||||
"relative_file_path": "Part1/TextureSupply_Boards001_F_28x300cm-Normal.jpg",
|
||||
"classified_file_type": "MAP_NRM",
|
||||
"proposed_asset_group_name": "Boards001_F"
|
||||
}
|
||||
],
|
||||
"asset_group_classifications": {
|
||||
"Boards001_A": "Surface",
|
||||
"Boards001_B": "Surface",
|
||||
"Boards001_C": "Surface",
|
||||
"Boards001_D": "Surface",
|
||||
"Boards001_E": "Surface",
|
||||
"Boards001_F": "Surface"
|
||||
}
|
||||
}
|
||||
}
|
||||
],
|
||||
"asset_type_definition_format": "{KEY} = {DESCRIPTION}, examples of content of {KEY} could be: {EXAMPLES}",
|
||||
"file_type_definition_format": "{KEY} = {DESCRIPTION}, examples of keywords for {KEY} could be: {EXAMPLES}",
|
||||
"llm_endpoint_url": "http://100.65.14.122:1234/v1/chat/completions",
|
||||
"llm_api_key": "",
|
||||
"llm_model_name": "qwen2.5-coder:3b",
|
||||
"llm_temperature": 0.5,
|
||||
"llm_request_timeout": 120,
|
||||
"llm_predictor_prompt": "You are an expert asset classification system. Your task is to analyze a list of file paths, understand their relationships based on naming and directory structure, and output a structured JSON object that classifies each file individually and then classifies the logical asset groups they belong to.\\n\\nDefinitions:\\n\\nAsset Types: These define the overall category of a logical asset group. Use one of the following keys when classifying asset groups. Each definition is provided as a formatted string (e.g., 'Surface = A single PBR material set..., examples: WoodFloor01, MetalPlate05'):\\n{ASSET_TYPE_DEFINITIONS}\\n\\n\\nFile Types: These define the specific purpose of each individual file. Use one of the following keys when classifying individual files. Each definition is provided as a formatted string (e.g., 'MAP_COL = Color/Albedo Map, examples: _col., _basecolor.'):\\n{FILE_TYPE_DEFINITIONS}\\n\\n\\nCore Task & Logic:\\n\\n1. **Individual File Analysis:**\\n * Examine each `relative_file_path` in the input `FILE_LIST`.\\n * For EACH file, determine its most likely `classified_file_type` using the `FILE_TYPE_DEFINITIONS`. Pay attention to filename suffixes, keywords, and extensions. Use `FILE_IGNORE` for files like `Thumbs.db` or `.DS_Store`. Use `EXTRA` for previews, metadata, or unidentifiable maps.\\n * For EACH file, propose a logical `proposed_asset_group_name` (string). 
This name should represent the asset the file likely belongs to, based on common base names (e.g., `WoodFloor01` from `WoodFloor01_col.png`, `WoodFloor01_nrm.png`) or directory structure (e.g., `SciFi_Drone` for files within that folder).\\n * Files that seem to be standalone utility maps (like `scratches.png`, `FlowMap.tif`) should get a unique group name derived from their filename (e.g., `Scratches`, `FlowMap`).\\n * If a file doesn't seem to belong to any logical group (e.g., a stray readme file in the root), you can propose `null` or a generic name like `Miscellaneous`.\\n * Be consistent with the proposed names for files belonging to the same logical asset.\\n * Populate the `individual_file_analysis` array with one object for *every* file in the input list, containing `relative_file_path`, `classified_file_type`, and `proposed_asset_group_name`.\\n\\n2. **Asset Group Classification:**\\n * Collect all unique, non-null `proposed_asset_group_name` values generated in the previous step.\\n * For EACH unique group name, determine the overall `asset_type` (using `ASSET_TYPE_DEFINITIONS`) based on the types of files assigned to that group name in the `individual_file_analysis`.\\n * Example: If files proposed as `AssetGroup1` include `MAP_COL`, `MAP_NRM`, `MAP_ROUGH`, classify `AssetGroup1` as `Surface`.\\n * Example: If files proposed as `AssetGroup2` include `MODEL` and texture maps, classify `AssetGroup2` as `Model`.\\n * Example: If `AssetGroup3` only has one file classified as `MAP_IMPERFECTION`, classify `AssetGroup3` as `UtilityMap`.\\n * Populate the `asset_group_classifications` dictionary, mapping each unique `proposed_asset_group_name` to its determined `asset_type`.\\n\\nInput File List:\\n\\ntext\\n{FILE_LIST}\\n\\n\\nOutput Format:\\n\\nYour response MUST be ONLY a single JSON object. You MAY include comments (using // or /* */) within the JSON structure for clarification if needed, but the core structure must be valid JSON. 
Do NOT include any text, explanations, or introductory phrases before or after the JSON object itself. Ensure all strings are correctly quoted and escaped.\\n\\nCRITICAL: The output JSON structure must strictly adhere to the following format:\\n\\n```json\\n{{\\n \"individual_file_analysis\": [\\n {{\\n // Optional comment about this file\\n \"relative_file_path\": \"string\", // Exact relative path from the input list\\n \"classified_file_type\": \"string\", // Key from FILE_TYPE_DEFINITIONS\\n \"proposed_asset_group_name\": \"string_or_null\" // Your suggested group name for this file\\n }}\\n // ... one object for EVERY file in the input list\\n ],\\n \"asset_group_classifications\": {{\\n // Dictionary mapping unique proposed group names to asset types\\n \"ProposedGroupName1\": \"string\", // Key: proposed_asset_group_name, Value: Key from ASSET_TYPE_DEFINITIONS\\n \"ProposedGroupName2\": \"string\"\\n // ... one entry for each unique, non-null proposed_asset_group_name\\n }}\\n}}\\n```\\n\\nExamples:\\n\\nHere are examples of input file lists and the desired JSON output, illustrating the two-part structure:\\n\\njson\\n[\\n {EXAMPLE_INPUT_OUTPUT_PAIRS}\\n]\\n\\n\\nNow, process the provided FILE_LIST and generate ONLY the JSON output according to these instructions. Remember to include an entry in `individual_file_analysis` for every single input file path."
|
||||
}
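The endpoint, model, and prompt settings above feed an OpenAI-compatible `/v1/chat/completions` request. A minimal sketch of how such a payload could be assembled from this file — the helper name and the `.replace()`-based prompt filling are assumptions for illustration, not the project's actual API:

```python
def build_classification_request(settings: dict, file_list: list[str]) -> dict:
    """Fill the predictor prompt's {FILE_LIST} placeholder and wrap it in a chat payload."""
    # .replace() is used instead of str.format() because the real prompt also
    # contains literal braces (e.g. the {{ ... }} JSON skeleton).
    prompt = settings["llm_predictor_prompt"].replace("{FILE_LIST}", "\n".join(file_list))
    return {
        "model": settings["llm_model_name"],
        "temperature": settings["llm_temperature"],
        "messages": [{"role": "user", "content": prompt}],
    }


# Example with a trimmed-down settings dict (hypothetical prompt text):
settings = {
    "llm_model_name": "qwen2.5-coder:3b",
    "llm_temperature": 0.5,
    "llm_predictor_prompt": "Classify these files:\n{FILE_LIST}",
}
payload = build_classification_request(settings, ["WoodFloor01_col.png", "WoodFloor01_nrm.png"])
print(payload["model"])  # qwen2.5-coder:3b
```

The resulting dict can be POSTed to `llm_endpoint_url` with any HTTP client, honoring `llm_request_timeout`.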
@ -1,11 +0,0 @@
{
  "Dimensiva": {
    "normal_map_type": "OpenGL"
  },
  "Dinesen": {
    "normal_map_type": "OpenGL"
  },
  "Poliigon": {
    "normal_map_type": "OpenGL"
  }
}
16
autotest.py
@ -589,6 +589,8 @@ class AutoTester(QObject):
                    if not self._compare_list_of_rules(actual_value, expected_value, "FileRule", current_context, "file_path"):
                        item_match = False
                else: # Regular field comparison
                    if key == "preset_name":
                        print(f"DEBUG: Comparing preset_name: Actual='{actual_value}', Expected='{expected_value}' for {item_type_name} ({current_context})")
                    if actual_value != expected_value:
                        # Handle None vs "None" string for preset_name specifically if it's a common issue
                        if key == "preset_name" and actual_value is None and expected_value == "None":
@ -802,7 +804,7 @@ class AutoTester(QObject):
        autotest_memory_handler.close() # MemoryHandler close is a no-op but good practice
        autotest_memory_handler = None

    logger.info(f"Test {'succeeded' if success else 'failed'}. Cleaning up and exiting...you can ignore the non-zero exitcode") # KEEP INFO - Passes filter
    logger.info(f"Test {'succeeded' if success else 'failed'}. Cleaning up and exiting...") # KEEP INFO - Passes filter
    q_app = QCoreApplication.instance()
    if q_app:
        q_app.quit()
@ -813,7 +815,7 @@ def main():
    """Main function to run the autotest script."""
    cli_args = parse_arguments()
    # Logger is configured above, this will now use the new filtered setup
    #logger.info(f"Parsed CLI arguments: {cli_args}") # KEEP INFO - Passes filter
    logger.info(f"Parsed CLI arguments: {cli_args}") # KEEP INFO - Passes filter

    # Clean and ensure output directory exists
    output_dir_path = Path(cli_args.outputdir)
@ -854,15 +856,9 @@ def main():
    try:
        # Instantiate main.App() - this should create MainWindow but not show it by default
        # if App is designed to not show GUI unless app.main_window.show() is called.
        # Define a user config path for the test environment
        test_user_config_path = project_root / "TestFiles" / "TestConfig"
        test_user_config_path.mkdir(parents=True, exist_ok=True) # Ensure the directory exists

        app_instance = App(user_config_path=str(test_user_config_path)) # Pass the path as a string
        # Load the preset after App initialization
        app_instance.load_preset(cli_args.preset)
        app_instance = App()
    except Exception as e:
        logger.error(f"Failed to initialize main.App or load preset: {e}", exc_info=True)
        logger.error(f"Failed to initialize main.App: {e}", exc_info=True)
        sys.exit(1)

    if not app_instance.main_window:
@ -1,167 +0,0 @@
# Plan: Assessing Compilation of Asset Processor with PyInstaller and Cython

## Objective

To assess the feasibility and create a plan for compiling the Asset Processor project into standalone executables using PyInstaller, incorporating Cython for general speedup and source code obfuscation. A key requirement is to maintain user access to, and the ability to modify, configuration files (like `user_settings.json`, `asset_type_definitions.json`, etc.) and `Preset` files post-compilation.

---

## Phase 1: Initial Analysis & Information Gathering

* **Project Dependencies (from [`requirements.txt`](requirements.txt:1)):**
    * `opencv-python`
    * `numpy`
    * `openexr`
    * `PySide6`
    * `py7zr`
    * `rarfile`
    * `requests`
    * *Note: `PySide6`, `opencv-python`, and `openexr` may require special handling with PyInstaller (e.g., hidden imports, hooks).*
* **Configuration Loading (based on [`configuration.py`](configuration.py:1)):**
    * Configuration files (`app_settings.json`, `llm_settings.json`, `asset_type_definitions.json`, `file_type_definitions.json`, `user_settings.json`, `suppliers.json`) are loaded from a `config/` subdirectory relative to [`configuration.py`](configuration.py:1).
    * Preset files are loaded from a `Presets/` subdirectory relative to [`configuration.py`](configuration.py:1).
    * `BASE_DIR` is `Path(__file__).parent`, which will refer to the bundled location in a PyInstaller build.
    * [`user_settings.json`](configuration.py:16) is designed for overrides and is a candidate for external management.
    * Saving functions write back to these relative paths, which needs adaptation.
* **Potential Cython Candidates:**
    * Modules within the `processing/` directory.
    * Specifically: `processing/utils/image_processing_utils.py` and individual stage files in `processing/pipeline/stages/` (e.g., `alpha_extraction_to_mask.py`, `gloss_to_rough_conversion.py`, etc.).
    * Other modules (e.g., `processing/pipeline/orchestrator.py`) could be Cythonized primarily for obfuscation.
* **User-Accessible Files (Defaults):**
    * The `config/` directory (containing `app_settings.json`, `asset_type_definitions.json`, `file_type_definitions.json`, `llm_settings.json`, `suppliers.json`).
    * The `Presets/` directory and its contents.

---

## Phase 2: Strategy Development

1. **Cython Strategy:**
    * **Build Integration:** Utilize a `setup.py` script with `setuptools` and `Cython.Build.cythonize` to compile `.py` files into C extensions (`.pyd` on Windows, `.so` on Linux/macOS).
    * **Candidate Prioritization:** Focus on `processing/` modules for performance gains and obfuscation.
    * **Compatibility & Challenges:**
        * GUI modules (PySide6) are generally left as Python.
        * Ensure compatibility with OpenCV, NumPy, and OpenEXR.
        * Address potential issues with highly dynamic Python code.
        * Consider iterative conversion to `.pyx` files with C-style type annotations for maximum performance in identified hot spots.
    * **Obfuscation:** The primary goal for many modules might be obfuscation rather than pure speedup.

2. **PyInstaller Strategy:**
    * **Bundle Type:** One-directory bundle (`--onedir`) is recommended for easier debugging and data file management.
    * **Data Files (`.spec` file `datas` section):**
        * Bundle default `config/` directory (containing `app_settings.json`, `asset_type_definitions.json`, `file_type_definitions.json`, `llm_settings.json`, `suppliers.json`).
        * Bundle default `Presets/` directory.
        * Include any other necessary GUI assets (icons, etc.).
        * Consider bundling the `blender_addon/` if it's to be deployed with the app.
    * **Hidden Imports & Hooks (`.spec` file):**
        * Add explicit `hiddenimports` for `PySide6`, `opencv-python`, `openexr`, and any other problematic libraries.
        * Utilize or create PyInstaller hooks if necessary.
    * **Console Window:** Disable for GUI application (`console=False`).

3. **User-Accessible Files & First-Time Setup Strategy:**
    * **First-Run Detection:** Application checks for a marker file or stored configuration path.
    * **First-Time Setup UI (PySide6 Dialog):**
        * **Configuration Location Choice:**
            * Option A (Recommended): Store in user profile (e.g., `Documents/AssetProcessor` or `AppData/Roaming/AssetProcessor`).
            * Option B (Advanced): User chooses a custom folder.
        * The application copies default `config/` (excluding `app_settings.json` but including other definition files) and `Presets/` to the chosen location.
        * The chosen path is saved.
    * **Key Application Settings Configuration (saved to `user_settings.json` in user's chosen location):**
        * Default Library Output Path (`OUTPUT_BASE_DIR`).
        * Asset Structure (`OUTPUT_DIRECTORY_PATTERN`).
        * Image Output Formats (`OUTPUT_FORMAT_16BIT_PRIMARY`, `OUTPUT_FORMAT_16BIT_FALLBACK`, `OUTPUT_FORMAT_8BIT`).
        * JPG Threshold (`RESOLUTION_THRESHOLD_FOR_JPG`).
        * Blender Paths (`DEFAULT_NODEGROUP_BLEND_PATH`, `DEFAULT_MATERIALS_BLEND_PATH`, `BLENDER_EXECUTABLE_PATH`).
    * **Configuration Loading Logic Modification ([`configuration.py`](configuration.py:1)):**
        * `BASE_DIR` for user-modifiable files will point to the user-chosen location.
        * `app_settings.json` (master defaults) always loaded from the bundle.
        * `user_settings.json` loaded from the user-chosen location, containing overrides.
        * Other definition files and `Presets` loaded from the user-chosen location, with a fallback/re-copy mechanism from bundled defaults if missing.
    * **Saving Logic Modification ([`configuration.py`](configuration.py:1)):**
        * All configuration saving functions will write to the user-chosen configuration location. Bundled defaults remain read-only post-installation.
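The loading precedence described above — bundled `app_settings.json` as read-only defaults, `user_settings.json` from the user-chosen directory as overrides — can be sketched as follows. The function name and the shallow-merge approach are assumptions, not the project's actual implementation (nested settings may need a recursive merge instead):

```python
import json
import tempfile
from pathlib import Path

def load_effective_settings(bundled_config_dir: Path, user_config_dir: Path) -> dict:
    """Merge bundled defaults with user overrides; user values win."""
    defaults = json.loads((bundled_config_dir / "app_settings.json").read_text())
    user_file = user_config_dir / "user_settings.json"
    overrides = json.loads(user_file.read_text()) if user_file.exists() else {}
    return {**defaults, **overrides}  # shallow merge: top-level user keys override

# Example with hypothetical settings values in temporary directories:
with tempfile.TemporaryDirectory() as tmp:
    bundled = Path(tmp) / "bundle"
    user = Path(tmp) / "user"
    bundled.mkdir()
    user.mkdir()
    (bundled / "app_settings.json").write_text(
        '{"OUTPUT_BASE_DIR": "C:/Library", "RESOLUTION_THRESHOLD_FOR_JPG": 2048}'
    )
    (user / "user_settings.json").write_text('{"OUTPUT_BASE_DIR": "D:/Assets"}')
    merged = load_effective_settings(bundled, user)
    print(merged["OUTPUT_BASE_DIR"])  # D:/Assets
```

A missing `user_settings.json` simply yields the bundled defaults, which matches the fallback behavior described above.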
---

## Phase 3: Outline of Combined Build Process

1. **Environment Setup (Developer):** Install Python, Cython, PyInstaller, and project dependencies.
2. **Cythonization (`setup.py`):**
    * Create `setup.py` using `setuptools` and `Cython.Build.cythonize`.
    * List `.py` files/modules for compilation (e.g., `processing.utils.image_processing_utils`, `processing.pipeline.stages.*`).
    * Include `numpy.get_include()` if Cython files use NumPy C-API.
    * Run `python setup.py build_ext --inplace` to generate `.pyd`/`.so` files.
3. **PyInstaller Packaging (`.spec` file):**
    * Generate initial `AssetProcessor.spec` with `pyinstaller --name AssetProcessor main.py`.
    * Modify `.spec` file:
        * `datas`: Add default `config/` and `Presets/` directories, and other assets.
        * `hiddenimports`: List modules for `PySide6`, `opencv-python`, etc.
        * `excludes`: Optionally exclude original `.py` files for Cythonized modules.
        * Set `onedir = True`, `onefile = False`, `console = False`.
    * Run `pyinstaller AssetProcessor.spec` to create `dist/AssetProcessor`.
4. **Post-Build Steps (Optional):**
    * Clean up original `.py` files from `dist/` if obfuscation is paramount.
    * Archive `dist/AssetProcessor` for distribution (ZIP, installer).
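The Cythonization step above could be sketched as a minimal `setup.py`. The module list mirrors the candidates named in Phase 1; the exact paths, glob pattern, and options are assumptions to be adapted to the real project layout:

```python
# Hypothetical setup.py sketch for the Cythonization step (a build-config
# fragment, not the project's actual build script).
from setuptools import setup
from Cython.Build import cythonize
import numpy

CYTHON_MODULES = [
    "processing/utils/image_processing_utils.py",
    "processing/pipeline/stages/*.py",  # glob: compile every stage module
]

setup(
    name="asset_processor_ext",
    ext_modules=cythonize(
        CYTHON_MODULES,
        compiler_directives={"language_level": "3"},
    ),
    include_dirs=[numpy.get_include()],  # needed if .pyx sources use the NumPy C-API
)
```

Running `python setup.py build_ext --inplace` then places the compiled `.pyd`/`.so` files next to their sources, ready for PyInstaller to pick up.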
---

## Phase 4: Distribution Structure

**Inside `dist/AssetProcessor/` (Distribution Package):**

* `AssetProcessor.exe` (or platform equivalent)
* Core Python and library dependencies (DLLs/SOs)
* Cythonized modules (`.pyd`/`.so` files, e.g., `processing/utils/image_processing_utils.pyd`)
* Non-Cythonized Python modules (`.pyc` files)
* Bundled default `config/` directory (with `app_settings.json`, `asset_type_definitions.json`, etc.)
* Bundled default `Presets/` directory (with `_template.json`, `Dinesen.json`, etc.)
* Other GUI assets (icons, etc.)
* Potentially `blender_addon/` files if bundled.

**User's Configuration Directory (e.g., `Documents/AssetProcessor/`, created on first run):**

* `user_settings.json` (user's choices for paths, formats, etc.)
* Copied `config/` directory (for user modification of `asset_type_definitions.json`, etc.)
* Copied `Presets/` directory (for user modification/additions)
* Marker file for first-time setup choice.
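The marker-file-based first-run detection mentioned above could look like this sketch. The marker filename and both helper names are assumptions for illustration:

```python
import tempfile
from pathlib import Path

MARKER_NAME = "setup_complete.marker"  # hypothetical marker file name

def needs_first_time_setup(config_dir: Path) -> bool:
    """True when the marker file is absent, i.e. the setup dialog should run."""
    return not (config_dir / MARKER_NAME).exists()

def mark_setup_complete(config_dir: Path) -> None:
    """Create the config directory (if needed) and drop the marker file."""
    config_dir.mkdir(parents=True, exist_ok=True)
    (config_dir / MARKER_NAME).touch()

# Example round-trip in a temporary directory:
with tempfile.TemporaryDirectory() as tmp:
    cfg = Path(tmp) / "AssetProcessor"
    first_run = needs_first_time_setup(cfg)   # True: no marker yet
    mark_setup_complete(cfg)
    second_run = needs_first_time_setup(cfg)  # False: marker present
```

Storing the chosen configuration path alongside the marker (or in a small `settings.ini`, as hinted at in `configuration.py`) would let subsequent launches skip the dialog entirely.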
---

## Phase 5: Plan for Testing & Validation

1. **Core Functionality:** Test GUI operations, Directory Monitor, CLI (if applicable).
2. **Configuration System:**
    * Verify first-time setup UI, config location choice, copying of defaults.
    * Confirm loading from and saving to the user's chosen config location.
    * Test modification of user configs and application's reflection of changes.
3. **Dependency Checks:** Ensure bundled libraries (PySide6, OpenCV) function correctly.
4. **Performance (Cython):** Basic comparison of critical operations (Python vs. Cythonized).
5. **Obfuscation (Cython):** Verify absence of original `.py` files for Cythonized modules in distribution (if desired) and that `.pyd`/`.so` files are used.
6. **Cross-Platform Testing:** Repeat build and test process on all target OS.
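The Python-vs-Cython comparison in item 4 can be a simple best-of-N timing harness; the commented A/B usage below uses hypothetical module names:

```python
import timeit

def time_call(fn, *args, repeat=5, number=100):
    """Best-of-N wall-clock time for `number` calls of fn(*args)."""
    # min() of several repeats damps scheduler noise, per the timeit docs' advice.
    return min(timeit.repeat(lambda: fn(*args), repeat=repeat, number=number))

# Hypothetical A/B usage against a pure-Python and a Cythonized stage module:
# py_t = time_call(py_stage.process, image)
# cy_t = time_call(cy_stage.process, image)
# print(f"speedup: {py_t / cy_t:.2f}x")

elapsed = time_call(sum, range(1000))  # trivial self-contained demonstration
```

Profiling first (e.g., with `cProfile`) keeps this comparison focused on the CPU-bound loops the Cython mitigation strategy targets.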
---

## Phase 6: Documentation Outline

1. **Developer/Build Documentation:**
    * Build environment setup.
    * `setup.py` (Cython) and `pyinstaller` command usage.
    * Structure of `setup.py` and `.spec` file, key configurations.
    * Troubleshooting common build issues.
2. **User Documentation:**
    * First-time setup guide (config location, initial settings).
    * Managing user-specific configurations and presets (location, backup).
    * How to reset to default configurations.

---

## Phase 7: Risk Assessment & Mitigation (Brief)

* **Risk:** Cython compilation issues.
    * **Mitigation:** Incremental compilation, selective Cythonization.
* **Risk:** PyInstaller packaging complexities.
    * **Mitigation:** Thorough testing, community hooks, iterative `.spec` refinement.
* **Risk:** Logic errors in new configuration loading/saving.
    * **Mitigation:** Careful coding, detailed testing of config pathways.
* **Risk:** Cython performance not meeting expectations.
    * **Mitigation:** Profile Python code first; focus Cython on CPU-bound loops.
* **Risk:** Increased build complexity.
    * **Mitigation:** Automate build steps with scripts.
@ -1,4 +1,5 @@
{
"TARGET_FILENAME_PATTERN": "{base_name}_{map_type}_{resolution}.{ext}",
"RESPECT_VARIANT_MAP_TYPES": [
"COL"
],
@ -37,7 +38,7 @@
"G": 0.5,
"B": 0.5
},
"bit_depth_policy": "preserve"
"output_bit_depth": "respect_inputs"
}
],
"CALCULATE_STATS_RESOLUTION": "1K",
@ -1,7 +1,7 @@
{
"FILE_TYPE_DEFINITIONS": {
"MAP_COL": {
"bit_depth_policy": "force_8bit",
"bit_depth_rule": "force_8bit",
"color": "#ffaa00",
"description": "Color/Albedo Map",
"examples": [
@ -15,7 +15,7 @@
"standard_type": "COL"
},
"MAP_NRM": {
"bit_depth_policy": "preserve",
"bit_depth_rule": "respect",
"color": "#cca2f1",
"description": "Normal Map",
"examples": [
@ -27,7 +27,7 @@
"standard_type": "NRM"
},
"MAP_METAL": {
"bit_depth_policy": "force_8bit",
"bit_depth_rule": "force_8bit",
"color": "#dcf4f2",
"description": "Metalness Map",
"examples": [
@ -39,7 +39,7 @@
"standard_type": "METAL"
},
"MAP_ROUGH": {
"bit_depth_policy": "force_8bit",
"bit_depth_rule": "force_8bit",
"color": "#bfd6bf",
"description": "Roughness Map",
"examples": [
@ -52,7 +52,7 @@
"standard_type": "ROUGH"
},
"MAP_GLOSS": {
"bit_depth_policy": "force_8bit",
"bit_depth_rule": "force_8bit",
"color": "#d6bfd6",
"description": "Glossiness Map",
"examples": [
@ -64,7 +64,7 @@
"standard_type": "GLOSS"
},
"MAP_AO": {
"bit_depth_policy": "force_8bit",
"bit_depth_rule": "force_8bit",
"color": "#e3c7c7",
"description": "Ambient Occlusion Map",
"examples": [
@ -76,7 +76,7 @@
"standard_type": "AO"
},
"MAP_DISP": {
"bit_depth_policy": "preserve",
"bit_depth_rule": "respect",
"color": "#c6ddd5",
"description": "Displacement/Height Map",
"examples": [
@ -88,7 +88,7 @@
"standard_type": "DISP"
},
"MAP_REFL": {
"bit_depth_policy": "force_8bit",
"bit_depth_rule": "force_8bit",
"color": "#c2c2b9",
"description": "Reflection/Specular Map",
"examples": [
@ -100,7 +100,7 @@
"standard_type": "REFL"
},
"MAP_SSS": {
"bit_depth_policy": "preserve",
"bit_depth_rule": "respect",
"color": "#a0d394",
"description": "Subsurface Scattering Map",
"examples": [
@ -112,7 +112,7 @@
"standard_type": "SSS"
},
"MAP_FUZZ": {
"bit_depth_policy": "force_8bit",
"bit_depth_rule": "force_8bit",
"color": "#a2d1da",
"description": "Fuzz/Sheen Map",
"examples": [
@ -124,7 +124,7 @@
"standard_type": "FUZZ"
},
"MAP_IDMAP": {
"bit_depth_policy": "force_8bit",
"bit_depth_rule": "force_8bit",
"color": "#ca8fb4",
"description": "ID Map (for masking)",
"examples": [
@ -136,7 +136,7 @@
"standard_type": "IDMAP"
},
"MAP_MASK": {
"bit_depth_policy": "force_8bit",
"bit_depth_rule": "force_8bit",
"color": "#c6e2bf",
"description": "Generic Mask Map",
"examples": [
@ -146,19 +146,8 @@
"keybind": "",
"standard_type": "MASK"
},
"MAP_NRMRGH": {
"bit_depth_policy": "preserve",
"color": "#abcdef",
"description": "Packed Normal + Roughness + Metallic Map",
"examples": [
"_nrmrgh."
],
"is_grayscale": false,
"keybind": "",
"standard_type": "NRMRGH"
},
"MAP_IMPERFECTION": {
"bit_depth_policy": "force_8bit",
"bit_depth_rule": "force_8bit",
"color": "#e6d1a6",
"description": "Imperfection Map (scratches, dust)",
"examples": [
@ -175,7 +164,7 @@
"standard_type": "IMPERFECTION"
},
"MODEL": {
"bit_depth_policy": "",
"bit_depth_rule": "",
"color": "#3db2bd",
"description": "3D Model File",
"examples": [
@ -187,7 +176,7 @@
"standard_type": ""
},
"EXTRA": {
"bit_depth_policy": "",
"bit_depth_rule": "",
"color": "#8c8c8c",
"description": "asset previews or metadata",
"examples": [
@ -204,7 +193,7 @@
"standard_type": "EXTRA"
},
"FILE_IGNORE": {
"bit_depth_policy": "",
"bit_depth_rule": "",
"color": "#673d35",
"description": "File identified to be ignored due to prioritization rules (e.g., a lower bit-depth version when a higher one is present).",
"category": "Ignored",
843
configuration.py
@ -1,42 +1,26 @@
import json
import os
import sys
import shutil
from pathlib import Path
import logging
import re
import collections.abc
from typing import Optional, Union
from typing import Optional

log = logging.getLogger(__name__)

# This BASE_DIR is primarily for fallback when not bundled or for locating bundled resources relative to the script.
_SCRIPT_DIR = Path(__file__).resolve().parent
BASE_DIR = Path(__file__).parent
APP_SETTINGS_PATH = BASE_DIR / "config" / "app_settings.json"
LLM_SETTINGS_PATH = BASE_DIR / "config" / "llm_settings.json"
ASSET_TYPE_DEFINITIONS_PATH = BASE_DIR / "config" / "asset_type_definitions.json"
FILE_TYPE_DEFINITIONS_PATH = BASE_DIR / "config" / "file_type_definitions.json"
USER_SETTINGS_PATH = BASE_DIR / "config" / "user_settings.json"
SUPPLIERS_CONFIG_PATH = BASE_DIR / "config" / "suppliers.json"
PRESETS_DIR = BASE_DIR / "Presets"

class ConfigurationError(Exception):
    """Custom exception for configuration loading errors."""
    pass

def _get_user_config_path_placeholder() -> Optional[Path]:
    """
    Placeholder function. In a real scenario, this would retrieve the
    saved user configuration path (e.g., from a settings file).
    Returns None if not set, triggering first-time setup behavior.
    """
    # For this subtask, we assume this path is determined externally and passed to Configuration.
    # If we were to implement the settings.ini check here, it would look like:
    # try:
    #     app_data_dir = Path(os.getenv('APPDATA')) / "AssetProcessor"
    #     settings_ini = app_data_dir / "settings.ini"
    #     if settings_ini.exists():
    #         with open(settings_ini, 'r') as f:
    #             path_str = f.read().strip()
    #         return Path(path_str)
    # except Exception:
    #     return None
    return None


def _get_base_map_type(target_map_string: str) -> str:
    """Extracts the base map type (e.g., 'COL') from a potentially numbered string ('COL-1')."""
    # Use regex to find the leading alphabetical part
@@ -108,288 +92,53 @@ def _deep_merge_dicts(base_dict: dict, override_dict: dict) -> dict:
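The body of `_deep_merge_dicts` is collapsed out of this diff, but the `__init__` comment below states that it merges the override dict onto the base dict in place. A minimal sketch of the usual recursive merge under that assumption (the name and exact behavior of the real helper may differ):

```python
import collections.abc

def deep_merge_dicts(base_dict: dict, override_dict: dict) -> dict:
    """Recursively merge override_dict onto base_dict, modifying base_dict in place."""
    for key, value in override_dict.items():
        if isinstance(value, collections.abc.Mapping) and isinstance(base_dict.get(key), dict):
            # Both sides are mappings: recurse so sibling keys survive.
            deep_merge_dicts(base_dict[key], value)
        else:
            # Scalars, lists, and new keys replace the base value outright.
            base_dict[key] = value
    return base_dict

core = {"OUTPUT": {"format": "png", "depth": 8}, "THREADS": 4}
user = {"OUTPUT": {"depth": 16}}
deep_merge_dicts(core, user)
print(core)  # {'OUTPUT': {'format': 'png', 'depth': 16}, 'THREADS': 4}
```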
class Configuration:
    """
    Loads and provides access to core settings combined with a specific preset,
    managing bundled and user-specific configuration paths.
    Loads and provides access to core settings combined with a specific preset.
    """
    BASE_DIR_APP_BUNDLED_CONFIG_SUBDIR_NAME = "config"
    PRESETS_DIR_APP_BUNDLED_NAME = "Presets"
    USER_SETTINGS_FILENAME = "user_settings.json"
    APP_SETTINGS_FILENAME = "app_settings.json"
    ASSET_TYPE_DEFINITIONS_FILENAME = "asset_type_definitions.json"
    FILE_TYPE_DEFINITIONS_FILENAME = "file_type_definitions.json"
    LLM_SETTINGS_FILENAME = "llm_settings.json"
    SUPPLIERS_CONFIG_FILENAME = "suppliers.json"
    USER_CONFIG_SUBDIR_NAME = "config" # Subdirectory within user's chosen config root for most jsons
    USER_PRESETS_SUBDIR_NAME = "Presets" # Subdirectory within user's chosen config root for presets

    def __init__(self, preset_name: str, base_dir_user_config: Optional[Path] = None, is_first_run_setup: bool = False):
    def __init__(self, preset_name: str):
        """
        Loads core config, user overrides, and the specified preset file.

        Args:
            preset_name: The name of the preset (without .json extension).
            base_dir_user_config: The root path for user-specific configurations.
                If None, loading of user-specific files will be skipped or may fail.
            is_first_run_setup: Flag indicating if this is part of the initial setup
                process where user config dir might be empty and fallbacks
                should not aggressively try to copy from bundle until UI confirms.

        Raises:
            ConfigurationError: If critical configurations cannot be loaded/validated.
            ConfigurationError: If core config or preset cannot be loaded/validated.
        """
        log.debug(f"Initializing Configuration with preset: '{preset_name}', user_config_dir: '{base_dir_user_config}', first_run_flag: {is_first_run_setup}")
        self._preset_filename_stem = preset_name
        self.base_dir_user_config: Optional[Path] = base_dir_user_config
        self.is_first_run_setup = is_first_run_setup
        self.base_dir_app_bundled: Path = self._determine_base_dir_app_bundled()
        log.debug(f"Initializing Configuration with preset filename stem: '{preset_name}'")
        self._preset_filename_stem = preset_name # Store the stem used for loading

        log.info(f"Determined BASE_DIR_APP_BUNDLED: {self.base_dir_app_bundled}")
        log.info(f"Using BASE_DIR_USER_CONFIG: {self.base_dir_user_config}")
        # 1. Load core settings
        self._core_settings: dict = self._load_core_config()

        # 1. Load core application settings (always from bundled)
        app_settings_path = self.base_dir_app_bundled / self.BASE_DIR_APP_BUNDLED_CONFIG_SUBDIR_NAME / self.APP_SETTINGS_FILENAME
        self._core_settings: dict = self._load_json_file(
            app_settings_path,
            is_critical=True,
            description="Core application settings"
        )
        # 2. Load asset type definitions
        self._asset_type_definitions: dict = self._load_asset_type_definitions()

        # 2. Load user settings (from user config dir, if provided)
        user_settings_overrides: dict = {}
        if self.base_dir_user_config:
            user_settings_file_path = self.base_dir_user_config / self.USER_SETTINGS_FILENAME
            user_settings_overrides = self._load_json_file(
                user_settings_file_path,
                is_critical=False, # Not critical if missing, especially on first run
                description=f"User settings from {user_settings_file_path}"
            ) or {} # Ensure it's a dict
        else:
            log.info(f"{self.USER_SETTINGS_FILENAME} not loaded: User config directory not set.")
        # 3. Load file type definitions
        self._file_type_definitions: dict = self._load_file_type_definitions()

        # 3. Deep merge user settings onto core settings
        # 4. Load user settings
        user_settings_overrides: dict = self._load_user_settings()

        # 5. Deep merge user settings onto core settings
        if user_settings_overrides:
            log.info(f"Applying user setting overrides to core settings.")
            log.info("Applying user setting overrides to core settings.")
            # _deep_merge_dicts modifies self._core_settings in place
            _deep_merge_dicts(self._core_settings, user_settings_overrides)

        # 4. Load other definition files (from user config dir, with fallback from bundled)
        self._asset_type_definitions: dict = self._load_definition_file_with_fallback(
            self.ASSET_TYPE_DEFINITIONS_FILENAME, "ASSET_TYPE_DEFINITIONS"
        )
        self._file_type_definitions: dict = self._load_definition_file_with_fallback(
            self.FILE_TYPE_DEFINITIONS_FILENAME, "FILE_TYPE_DEFINITIONS"
        )
        # 6. Load LLM settings
        self._llm_settings: dict = self._load_llm_config()

        # --- Migration Logic for file_type_definitions.json ---
        # Moved from _load_definition_file_with_fallback to ensure execution
        if isinstance(self._file_type_definitions, dict):
            log.debug(f"Applying migration logic for old bit depth terminology in {self.FILE_TYPE_DEFINITIONS_FILENAME}")
            for map_type_key, definition in self._file_type_definitions.items():
                if isinstance(definition, dict):
                    # Check for old key "bit_depth_rule"
                    if "bit_depth_rule" in definition:
                        old_rule = definition.pop("bit_depth_rule") # Remove old key
                        new_policy = old_rule # Start with the old value
                        if old_rule == "respect":
                            new_policy = "preserve" # Map old value to new
                        elif old_rule == "respect_inputs":
                            new_policy = "preserve" # Map old value to new (though this shouldn't be in FTD)
                        elif old_rule == "":
                            new_policy = "" # Keep empty string
                        # "force_8bit" and "force_16bit" values remain the same

                        definition["bit_depth_policy"] = new_policy # Add new key with migrated value
                        log.warning(f"Migrated old 'bit_depth_rule': '{old_rule}' to 'bit_depth_policy': '{new_policy}' for map type '{map_type_key}' in {self.FILE_TYPE_DEFINITIONS_FILENAME}. Please update your configuration file.")

                    # Also check for old value "respect" under the new key, in case the key was manually renamed but value wasn't
                    if "bit_depth_policy" in definition and definition["bit_depth_policy"] == "respect":
                        definition["bit_depth_policy"] = "preserve"
                        log.warning(f"Migrated old 'bit_depth_policy' value 'respect' to 'preserve' for map type '{map_type_key}' in {self.FILE_TYPE_DEFINITIONS_FILENAME}. Please update your configuration file.")

        # --- Migration Logic for app_settings.json (MAP_MERGE_RULES) ---
        # This needs to happen after core settings are loaded and potentially merged with user settings,
        # so it might be better placed in __init__ after the merge, or in a dedicated method called by __init__.
        # For now, let's focus on the file_type_definitions.json issue causing the autotest warnings.
        # The app_settings.json migration can be a separate step if needed, but the primary issue
        # seems to be with file_type_definitions.json loading in the test context.

        self._llm_settings: dict = self._load_definition_file_with_fallback(
            self.LLM_SETTINGS_FILENAME, None # LLM settings might be flat (no root key)
        )
        self._suppliers_config: dict = self._load_definition_file_with_fallback(
            self.SUPPLIERS_CONFIG_FILENAME, None # Suppliers config is flat
        )

        # 5. Load preset settings (from user config dir, with fallback from bundled)
        self._preset_settings: dict = self._load_preset_with_fallback(self._preset_filename_stem)
        # 7. Load preset settings (conceptually overrides combined base + user for shared keys)
        self._preset_settings: dict = self._load_preset(self._preset_filename_stem) # Use the stored stem

        # Store the actual preset name read from the file content
        self.actual_internal_preset_name = self._preset_settings.get("preset_name", self._preset_filename_stem)
        log.info(f"Configuration instance: Loaded preset file '{self._preset_filename_stem}.json', internal preset_name is '{self.actual_internal_preset_name}'")

        # 6. Validate and compile (after all base/user/preset settings are established)
        # 8. Validate and compile (after all base/user/preset settings are established)
        self._validate_configs()
        self._compile_regex_patterns()
        log.info(f"Configuration loaded successfully using preset: '{self.actual_internal_preset_name}'")
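The migration loop inside `__init__` above can be exercised in isolation. A standalone sketch that mirrors its mapping rules (function name illustrative; the real logic lives inline in `__init__` and additionally logs a warning per migrated entry):

```python
def migrate_bit_depth_keys(file_type_definitions: dict) -> dict:
    """Rewrite legacy 'bit_depth_rule' entries to 'bit_depth_policy' in place."""
    for definition in file_type_definitions.values():
        if not isinstance(definition, dict):
            continue
        if "bit_depth_rule" in definition:
            old_rule = definition.pop("bit_depth_rule")
            # 'respect' (and 'respect_inputs') become 'preserve'; other values carry over.
            definition["bit_depth_policy"] = (
                "preserve" if old_rule in ("respect", "respect_inputs") else old_rule
            )
        # Catch a manually renamed key whose value was left at the old spelling.
        if definition.get("bit_depth_policy") == "respect":
            definition["bit_depth_policy"] = "preserve"
    return file_type_definitions

ftd = {"MAP_COL": {"bit_depth_rule": "respect"}, "MAP_IDMAP": {"bit_depth_rule": "force_8bit"}}
print(migrate_bit_depth_keys(ftd))
# {'MAP_COL': {'bit_depth_policy': 'preserve'}, 'MAP_IDMAP': {'bit_depth_policy': 'force_8bit'}}
```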
    def _determine_base_dir_app_bundled(self) -> Path:
        """Determines the base directory for bundled application resources."""
        if getattr(sys, 'frozen', False) and hasattr(sys, '_MEIPASS'):
            # Running in a PyInstaller bundle
            log.debug(f"Running as bundled app, _MEIPASS: {sys._MEIPASS}")
            return Path(sys._MEIPASS)
        else:
            # Running as a script
            log.debug(f"Running as script, using _SCRIPT_DIR: {_SCRIPT_DIR}")
            return _SCRIPT_DIR

    def _ensure_dir_exists(self, dir_path: Path):
        """Ensures a directory exists, creating it if necessary."""
        try:
            if not dir_path.exists():
                log.info(f"Directory not found, creating: {dir_path}")
                dir_path.mkdir(parents=True, exist_ok=True)
            elif not dir_path.is_dir():
                raise ConfigurationError(f"Expected directory but found file: {dir_path}")
        except OSError as e:
            raise ConfigurationError(f"Failed to create or access directory {dir_path}: {e}")

    def _copy_default_if_missing(self, user_target_path: Path, bundled_source_subdir: str, filename: str) -> bool:
        """
        Copies a default file from the bundled location to the user config directory
        if it's missing in the user directory. This is for post-first-time-setup fallback.
        """
        if not self.base_dir_user_config:
            log.error(f"Cannot copy default for '{filename}': base_dir_user_config is not set.")
            return False

        if user_target_path.exists():
            log.debug(f"User file '{user_target_path}' already exists. No copy needed from bundle.")
            return False

        # This fallback copy should NOT happen during the initial UI-driven setup phase
        # where the UI is responsible for the first population of the user directory.
        # It's for subsequent runs where a user might have deleted a file.
        if self.is_first_run_setup:
            log.debug(f"'{filename}' missing in user dir during first_run_setup phase. UI should handle initial copy. Skipping fallback copy.")
            return False # File is missing, but UI should handle it.

        bundled_file_path = self.base_dir_app_bundled / bundled_source_subdir / filename
        if not bundled_file_path.is_file():
            log.warning(f"Default bundled file '{bundled_file_path}' not found. Cannot copy to user location '{user_target_path}'.")
            return False

        log.warning(f"User file '{user_target_path}' is missing. Attempting to restore from bundled default: '{bundled_file_path}'.")
        try:
            self._ensure_dir_exists(user_target_path.parent)
            shutil.copy2(bundled_file_path, user_target_path)
            log.info(f"Successfully copied '{bundled_file_path}' to '{user_target_path}'.")
            return True # File was copied
        except Exception as e:
            log.error(f"Failed to copy '{bundled_file_path}' to '{user_target_path}': {e}")
            return False # Copy failed
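The restore-from-bundle decision in `_copy_default_if_missing` above reduces to a small, testable rule once the class state is stripped away. A self-contained sketch of just that rule (simplified: no logging, no first-run guard), demonstrated with temporary directories:

```python
import shutil
import tempfile
from pathlib import Path

def copy_default_if_missing(user_target: Path, bundled_source: Path) -> bool:
    """Restore a missing user config file from the bundled default (simplified sketch)."""
    if user_target.exists():
        return False  # user copy present, nothing to do
    if not bundled_source.is_file():
        return False  # no bundled default to restore from
    user_target.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(bundled_source, user_target)
    return True

with tempfile.TemporaryDirectory() as tmp:
    bundled = Path(tmp) / "bundle" / "app_settings.json"
    bundled.parent.mkdir()
    bundled.write_text("{}")
    user = Path(tmp) / "user" / "config" / "app_settings.json"
    print(copy_default_if_missing(user, bundled))  # True: file restored
    print(copy_default_if_missing(user, bundled))  # False: already exists
```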
    def _load_json_file(self, file_path: Optional[Path], is_critical: bool = False, description: str = "configuration") -> dict:
        """Loads a JSON file, handling errors. Returns empty dict if not found and not critical."""
        if not file_path:
            if is_critical:
                raise ConfigurationError(f"Critical {description} file path is not defined.")
            log.debug(f"{description} file path is not defined. Returning empty dict.")
            return {}

        log.debug(f"Attempting to load {description} from: {file_path}")
        if not file_path.is_file():
            if is_critical:
                raise ConfigurationError(f"Critical {description} file not found: {file_path}")
            log.info(f"{description} file not found: {file_path}. Returning empty dict.")
            return {}
        try:
            with open(file_path, 'r', encoding='utf-8') as f:
                settings = json.load(f)
            log.debug(f"{description} loaded successfully from {file_path}.")
            return settings
        except json.JSONDecodeError as e:
            msg = f"Failed to parse {description} file {file_path}: Invalid JSON - {e}"
            if is_critical: raise ConfigurationError(msg)
            log.warning(msg + ". Returning empty dict.")
            return {}
        except Exception as e:
            msg = f"Failed to read {description} file {file_path}: {e}"
            if is_critical: raise ConfigurationError(msg)
            log.warning(msg + ". Returning empty dict.")
            return {}
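The contract of `_load_json_file` above — critical files raise, everything else degrades to an empty dict — can be shown in a few lines. A standalone sketch of the same contract (with stdlib exceptions standing in for `ConfigurationError` so the snippet is self-contained):

```python
import json
import tempfile
from pathlib import Path

def load_json_tolerant(file_path: Path, is_critical: bool = False) -> dict:
    """Mirror of the loader's contract: critical failures raise, others return {}."""
    if not file_path.is_file():
        if is_critical:
            raise FileNotFoundError(file_path)
        return {}
    try:
        return json.loads(file_path.read_text(encoding="utf-8"))
    except json.JSONDecodeError:
        if is_critical:
            raise
        return {}

with tempfile.TemporaryDirectory() as tmp:
    missing = Path(tmp) / "nope.json"
    broken = Path(tmp) / "broken.json"
    broken.write_text("{not json")
    print(load_json_tolerant(missing))  # {}
    print(load_json_tolerant(broken))   # {}
```

Returning `{}` for non-critical failures is what lets the fallback-copy path below decide whether to restore a bundled default.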
    def _load_definition_file_with_fallback(self, filename: str, root_key: Optional[str] = None) -> dict:
        """
        Loads a definition JSON file from the user config subdir.
        If not found and not first_run_setup, attempts to copy from bundled config subdir and then loads it.
        If base_dir_user_config is not set, loads directly from bundled (read-only).
        """
        data = {}
        user_file_path = None

        if self.base_dir_user_config:
            user_file_path = self.base_dir_user_config / self.USER_CONFIG_SUBDIR_NAME / filename
            data = self._load_json_file(user_file_path, is_critical=False, description=f"User {filename}")

            if not data: # If not found or failed to load from user path
                # Attempt fallback copy only if not in the initial setup phase by UI
                # and if the file was genuinely missing (not a parse error for an existing file)
                if not user_file_path.exists() and not self.is_first_run_setup:
                    if self._copy_default_if_missing(user_file_path, self.BASE_DIR_APP_BUNDLED_CONFIG_SUBDIR_NAME, filename):
                        data = self._load_json_file(user_file_path, is_critical=False, description=f"User {filename} after copy")
        else:
            # No user_config_dir, load directly from bundled (read-only)
            log.warning(f"User config directory not set. Loading '{filename}' from bundled defaults (read-only).")
            bundled_path = self.base_dir_app_bundled / self.BASE_DIR_APP_BUNDLED_CONFIG_SUBDIR_NAME / filename
            data = self._load_json_file(bundled_path, is_critical=False, description=f"Bundled {filename}")

        if not data:
            # If still no data, it's an issue, especially for critical definitions
            is_critical_def = filename in [self.ASSET_TYPE_DEFINITIONS_FILENAME, self.FILE_TYPE_DEFINITIONS_FILENAME]
            err_msg = f"Failed to load '{filename}' from user dir '{user_file_path if user_file_path else 'N/A'}' or bundled defaults. Critical functionality may be affected."
            if is_critical_def: raise ConfigurationError(err_msg)
            log.error(err_msg)
            return {}

        if root_key:
            if root_key not in data:
                raise ConfigurationError(f"Key '{root_key}' not found in loaded {filename} data: {data.keys()}")
            content = data[root_key]
            # Ensure content is a dictionary if a root_key is expected to yield one
            if not isinstance(content, dict):
                raise ConfigurationError(f"Content under root key '{root_key}' in {filename} must be a dictionary, got {type(content)}.")
            return content
        return data # For flat files
    def _load_preset_with_fallback(self, preset_name_stem: str) -> dict:
        """
        Loads a preset JSON file from the user's Presets subdir.
        If not found and not first_run_setup, attempts to copy from bundled Presets and then loads it.
        If base_dir_user_config is not set, loads directly from bundled (read-only).
        """
        preset_filename = f"{preset_name_stem}.json"
        preset_data = {}
        user_preset_file_path = None

        if self.base_dir_user_config:
            user_presets_dir = self.base_dir_user_config / self.USER_PRESETS_SUBDIR_NAME
            user_preset_file_path = user_presets_dir / preset_filename
            preset_data = self._load_json_file(user_preset_file_path, is_critical=False, description=f"User preset '{preset_filename}'")

            if not preset_data: # If not found or failed to load
                if not user_preset_file_path.exists() and not self.is_first_run_setup:
                    if self._copy_default_if_missing(user_preset_file_path, self.PRESETS_DIR_APP_BUNDLED_NAME, preset_filename):
                        preset_data = self._load_json_file(user_preset_file_path, is_critical=False, description=f"User preset '{preset_filename}' after copy")
        else:
            log.warning(f"User config directory not set. Loading preset '{preset_filename}' from bundled defaults (read-only).")
            bundled_presets_dir = self.base_dir_app_bundled / self.PRESETS_DIR_APP_BUNDLED_NAME
            bundled_preset_file_path = bundled_presets_dir / preset_filename
            # Presets are generally critical for operation if one is specified
            preset_data = self._load_json_file(bundled_preset_file_path, is_critical=True, description=f"Bundled preset '{preset_filename}'")

        if not preset_data:
            raise ConfigurationError(f"Preset file '{preset_filename}' could not be loaded from user dir '{user_preset_file_path if user_preset_file_path else 'N/A'}' or bundled defaults.")
        return preset_data
        log.info(f"Configuration loaded successfully using preset: '{self.actual_internal_preset_name}'") # Changed self.preset_name to self.actual_internal_preset_name
    def _compile_regex_patterns(self):
@@ -488,6 +237,118 @@ class Configuration:
        log.debug("Finished compiling regex patterns.")
    def _load_core_config(self) -> dict:
        """Loads settings from the core app_settings.json file."""
        log.debug(f"Loading core config from: {APP_SETTINGS_PATH}")
        if not APP_SETTINGS_PATH.is_file():
            raise ConfigurationError(f"Core configuration file not found: {APP_SETTINGS_PATH}")
        try:
            with open(APP_SETTINGS_PATH, 'r', encoding='utf-8') as f:
                settings = json.load(f)
            log.debug(f"Core config loaded successfully.")
            return settings
        except json.JSONDecodeError as e:
            raise ConfigurationError(f"Failed to parse core configuration file {APP_SETTINGS_PATH}: Invalid JSON - {e}")
        except Exception as e:
            raise ConfigurationError(f"Failed to read core configuration file {APP_SETTINGS_PATH}: {e}")

    def _load_llm_config(self) -> dict:
        """Loads settings from the llm_settings.json file."""
        log.debug(f"Loading LLM config from: {LLM_SETTINGS_PATH}")
        if not LLM_SETTINGS_PATH.is_file():
            # Log a warning but don't raise an error, allow fallback if possible
            log.warning(f"LLM configuration file not found: {LLM_SETTINGS_PATH}. LLM features might be disabled or use defaults.")
            return {}
        try:
            with open(LLM_SETTINGS_PATH, 'r', encoding='utf-8') as f:
                settings = json.load(f)
            log.debug(f"LLM config loaded successfully.")
            return settings
        except json.JSONDecodeError as e:
            log.error(f"Failed to parse LLM configuration file {LLM_SETTINGS_PATH}: Invalid JSON - {e}")
            return {}
        except Exception as e:
            log.error(f"Failed to read LLM configuration file {LLM_SETTINGS_PATH}: {e}")
            return {}


    def _load_preset(self, preset_name: str) -> dict:
        """Loads the specified preset JSON file."""
        log.debug(f"Loading preset: '{preset_name}' from {PRESETS_DIR}")
        if not PRESETS_DIR.is_dir():
            raise ConfigurationError(f"Presets directory not found: {PRESETS_DIR}")

        preset_file = PRESETS_DIR / f"{preset_name}.json"
        if not preset_file.is_file():
            raise ConfigurationError(f"Preset file not found: {preset_file}")

        try:
            with open(preset_file, 'r', encoding='utf-8') as f:
                preset_data = json.load(f)
            log.debug(f"Preset '{preset_name}' loaded successfully.")
            return preset_data
        except json.JSONDecodeError as e:
            raise ConfigurationError(f"Failed to parse preset file {preset_file}: Invalid JSON - {e}")
        except Exception as e:
            raise ConfigurationError(f"Failed to read preset file {preset_file}: {e}")

    def _load_asset_type_definitions(self) -> dict:
        """Loads asset type definitions from the asset_type_definitions.json file."""
        log.debug(f"Loading asset type definitions from: {ASSET_TYPE_DEFINITIONS_PATH}")
        if not ASSET_TYPE_DEFINITIONS_PATH.is_file():
            raise ConfigurationError(f"Asset type definitions file not found: {ASSET_TYPE_DEFINITIONS_PATH}")
        try:
            with open(ASSET_TYPE_DEFINITIONS_PATH, 'r', encoding='utf-8') as f:
                data = json.load(f)
            if "ASSET_TYPE_DEFINITIONS" not in data:
                raise ConfigurationError(f"Key 'ASSET_TYPE_DEFINITIONS' not found in {ASSET_TYPE_DEFINITIONS_PATH}")
            settings = data["ASSET_TYPE_DEFINITIONS"]
            if not isinstance(settings, dict):
                raise ConfigurationError(f"'ASSET_TYPE_DEFINITIONS' in {ASSET_TYPE_DEFINITIONS_PATH} must be a dictionary.")
            log.debug(f"Asset type definitions loaded successfully.")
            return settings
        except json.JSONDecodeError as e:
            raise ConfigurationError(f"Failed to parse asset type definitions file {ASSET_TYPE_DEFINITIONS_PATH}: Invalid JSON - {e}")
        except Exception as e:
            raise ConfigurationError(f"Failed to read asset type definitions file {ASSET_TYPE_DEFINITIONS_PATH}: {e}")

    def _load_file_type_definitions(self) -> dict:
        """Loads file type definitions from the file_type_definitions.json file."""
        log.debug(f"Loading file type definitions from: {FILE_TYPE_DEFINITIONS_PATH}")
        if not FILE_TYPE_DEFINITIONS_PATH.is_file():
            raise ConfigurationError(f"File type definitions file not found: {FILE_TYPE_DEFINITIONS_PATH}")
        try:
            with open(FILE_TYPE_DEFINITIONS_PATH, 'r', encoding='utf-8') as f:
                data = json.load(f)
            if "FILE_TYPE_DEFINITIONS" not in data:
                raise ConfigurationError(f"Key 'FILE_TYPE_DEFINITIONS' not found in {FILE_TYPE_DEFINITIONS_PATH}")
            settings = data["FILE_TYPE_DEFINITIONS"]
            if not isinstance(settings, dict):
                raise ConfigurationError(f"'FILE_TYPE_DEFINITIONS' in {FILE_TYPE_DEFINITIONS_PATH} must be a dictionary.")
            log.debug(f"File type definitions loaded successfully.")
            return settings
        except json.JSONDecodeError as e:
            raise ConfigurationError(f"Failed to parse file type definitions file {FILE_TYPE_DEFINITIONS_PATH}: Invalid JSON - {e}")
        except Exception as e:
            raise ConfigurationError(f"Failed to read file type definitions file {FILE_TYPE_DEFINITIONS_PATH}: {e}")

    def _load_user_settings(self) -> dict:
        """Loads user override settings from config/user_settings.json."""
        log.debug(f"Attempting to load user settings from: {USER_SETTINGS_PATH}")
        if not USER_SETTINGS_PATH.is_file():
            log.info(f"User settings file not found: {USER_SETTINGS_PATH}. Proceeding without user overrides.")
            return {}
        try:
            with open(USER_SETTINGS_PATH, 'r', encoding='utf-8') as f:
                settings = json.load(f)
            log.info(f"User settings loaded successfully from {USER_SETTINGS_PATH}.")
            return settings
        except json.JSONDecodeError as e:
            log.warning(f"Failed to parse user settings file {USER_SETTINGS_PATH}: Invalid JSON - {e}. Using empty user settings.")
            return {}
        except Exception as e:
            log.warning(f"Failed to read user settings file {USER_SETTINGS_PATH}: {e}. Using empty user settings.")
            return {}
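Both definition loaders above repeat the same unwrap-and-validate pattern: the JSON file carries its payload under a single root key (`"ASSET_TYPE_DEFINITIONS"`, `"FILE_TYPE_DEFINITIONS"`), which must exist and must map to a dict. A hedged sketch factoring that pattern out, with stdlib exceptions standing in for `ConfigurationError`:

```python
def unwrap_root_key(data: dict, root_key: str, source: str) -> dict:
    """Extract and type-check the payload stored under a required root key."""
    if root_key not in data:
        raise KeyError(f"Key '{root_key}' not found in {source}")
    payload = data[root_key]
    if not isinstance(payload, dict):
        raise TypeError(f"'{root_key}' in {source} must be a dictionary.")
    return payload

raw = {"FILE_TYPE_DEFINITIONS": {"MAP_COL": {"bit_depth_policy": "preserve"}}}
print(unwrap_root_key(raw, "FILE_TYPE_DEFINITIONS", "file_type_definitions.json"))
# {'MAP_COL': {'bit_depth_policy': 'preserve'}}
```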
    def _validate_configs(self):
        """Performs basic validation checks on loaded settings."""
@@ -549,6 +410,8 @@
                if not isinstance(prio_keyword, str):
                    raise ConfigurationError(f"Preset file '{self._preset_filename_stem}.json': Priority keyword at index {prio_kw_index} in rule {index} ('{rule['target_type']}') must be a string.")

        if not isinstance(self._core_settings.get('TARGET_FILENAME_PATTERN'), str):
            raise ConfigurationError("Core config 'TARGET_FILENAME_PATTERN' must be a string.")
        if not isinstance(self._core_settings.get('OUTPUT_DIRECTORY_PATTERN'), str):
            raise ConfigurationError("Core config 'OUTPUT_DIRECTORY_PATTERN' must be a string.")
        if not isinstance(self._core_settings.get('OUTPUT_FILENAME_PATTERN'), str):
@@ -582,14 +445,9 @@


    @property
    def supplier_name(self) -> str: # From preset
    def supplier_name(self) -> str:
        return self._preset_settings.get('supplier_name', 'DefaultSupplier')

    @property
    def suppliers_config(self) -> dict: # From suppliers.json
        """Returns the loaded suppliers configuration."""
        return self._suppliers_config

    @property
    def internal_display_preset_name(self) -> str:
        """Returns the 'preset_name' field from within the loaded preset JSON,
@@ -725,27 +583,26 @@
        """Gets the list of map types that must always be saved losslessly."""
        return self._core_settings.get('FORCE_LOSSLESS_MAP_TYPES', [])

    def get_bit_depth_policy(self, map_type_input: str) -> str:
    def get_bit_depth_rule(self, map_type_input: str) -> str:
        """
        Gets the bit depth policy ('force_8bit', 'force_16bit', 'preserve', '') for a given map type identifier.
        Gets the bit depth rule ('respect', 'force_8bit', 'force_16bit') for a given map type identifier.
        The map_type_input can be an FTD key (e.g., "MAP_COL") or a suffixed FTD key (e.g., "MAP_COL-1").
        """
        if not self._file_type_definitions: # Check if the attribute exists and is not empty
            log.warning("File type definitions not loaded. Cannot determine bit depth policy.")
            return "preserve" # Defaulting to 'preserve' as per refactor plan Phase 1 completion
            log.warning("File type definitions not loaded. Cannot determine bit depth rule.")
            return "respect"

        file_type_definitions = self._file_type_definitions

        # 1. Try direct match with map_type_input as FTD key
        definition = file_type_definitions.get(map_type_input)
        if definition:
            policy = definition.get('bit_depth_policy')
            # Valid policies include the empty string
            if policy in ['force_8bit', 'force_16bit', 'preserve', '']:
                return policy
            rule = definition.get('bit_depth_rule')
            if rule in ['respect', 'force_8bit', 'force_16bit']:
                return rule
            else:
                log.warning(f"FTD key '{map_type_input}' found, but 'bit_depth_policy' is missing or invalid: '{policy}'. Defaulting to 'preserve'.")
                return "preserve"
                log.warning(f"FTD key '{map_type_input}' found, but 'bit_depth_rule' is missing or invalid: '{rule}'. Defaulting to 'respect'.")
                return "respect"

        # 2. Try to derive base FTD key by stripping common variant suffixes
        # Regex to remove trailing suffixes like -<digits>, -<alphanum>, _<alphanum>
@@ -753,17 +610,17 @@
        if base_ftd_key_candidate != map_type_input:
            definition = file_type_definitions.get(base_ftd_key_candidate)
            if definition:
                policy = definition.get('bit_depth_policy')
                if policy in ['force_8bit', 'force_16bit', 'preserve', '']:
                    log.debug(f"Derived base FTD key '{base_ftd_key_candidate}' from '{map_type_input}' and found bit depth policy: {policy}")
                    return policy
                rule = definition.get('bit_depth_rule')
                if rule in ['respect', 'force_8bit', 'force_16bit']:
                    log.debug(f"Derived base FTD key '{base_ftd_key_candidate}' from '{map_type_input}' and found bit depth rule: {rule}")
                    return rule
                else:
                    log.warning(f"Derived base FTD key '{base_ftd_key_candidate}' from '{map_type_input}', but 'bit_depth_policy' is missing/invalid: '{policy}'. Defaulting to 'preserve'.")
                    return "preserve"
                    log.warning(f"Derived base FTD key '{base_ftd_key_candidate}' from '{map_type_input}', but 'bit_depth_rule' is missing/invalid: '{rule}'. Defaulting to 'respect'.")
                    return "respect"

        # If no match found after trying direct and derived keys
        log.warning(f"Map type identifier '{map_type_input}' (or its derived base) not found in FILE_TYPE_DEFINITIONS. Defaulting bit depth policy to 'preserve'.")
        return "preserve"
        log.warning(f"Map type identifier '{map_type_input}' (or its derived base) not found in FILE_TYPE_DEFINITIONS. Defaulting bit depth rule to 'respect'.")
        return "respect"

    def get_16bit_output_formats(self) -> tuple[str, str]:
        """Gets the primary and fallback format names for 16-bit output."""
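The suffix-stripping step referenced above ("remove trailing suffixes like -&lt;digits&gt;, -&lt;alphanum&gt;") is collapsed out of this diff, so the exact regex is not visible. An illustrative guess that only strips `-` suffixes, to avoid eating the `MAP_` prefix of un-suffixed keys — the real pattern in the module may differ:

```python
import re

def derive_base_ftd_key(map_type_input: str) -> str:
    """Drop one trailing variant suffix such as '-1' or '-b' (illustrative pattern)."""
    return re.sub(r'-[A-Za-z0-9]+$', '', map_type_input)

print(derive_base_ftd_key("MAP_COL-1"))  # MAP_COL
print(derive_base_ftd_key("MAP_COL"))    # MAP_COL (unchanged)
```

As in `get_bit_depth_policy` above, the derived key is only useful when it differs from the input, so callers compare the two before retrying the lookup.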
@@ -859,64 +716,9 @@
        return self._core_settings.get('LOW_RESOLUTION_THRESHOLD', 512)

    @property
    def FILE_TYPE_DEFINITIONS(self) -> dict: # Kept for compatibility if used directly
    def FILE_TYPE_DEFINITIONS(self) -> dict:
        return self._file_type_definitions

    # --- Save Methods ---
    def _save_json_to_user_config(self, data_to_save: dict, filename: str, subdir: Optional[str] = None, is_root_key_data: Optional[str] = None):
        """Helper to save a dictionary to a JSON file in the user config directory."""
        if not self.base_dir_user_config:
            raise ConfigurationError(f"Cannot save {filename}: User config directory (base_dir_user_config) is not set.")

        target_dir = self.base_dir_user_config
        if subdir:
            target_dir = target_dir / subdir

        self._ensure_dir_exists(target_dir)
        path = target_dir / filename

        data_for_json = {is_root_key_data: data_to_save} if is_root_key_data else data_to_save

        log.debug(f"Saving data to: {path}")
        try:
            with open(path, 'w', encoding='utf-8') as f:
                json.dump(data_for_json, f, indent=4)
            log.info(f"Data saved successfully to {path}")
        except Exception as e:
            log.error(f"Failed to save file {path}: {e}")
            raise ConfigurationError(f"Failed to save {filename}: {e}")
def save_user_settings(self, settings_dict: dict):
|
||||
"""Saves the provided settings dictionary to user_settings.json in the user config directory."""
|
||||
self._save_json_to_user_config(settings_dict, self.USER_SETTINGS_FILENAME)
|
||||
|
||||
def save_llm_settings(self, settings_dict: dict):
|
||||
"""Saves LLM settings to the user config directory's 'config' subdir."""
|
||||
self._save_json_to_user_config(settings_dict, self.LLM_SETTINGS_FILENAME, subdir=self.USER_CONFIG_SUBDIR_NAME)
|
||||
|
||||
def save_asset_type_definitions(self, data: dict):
|
||||
"""Saves asset type definitions to the user config directory's 'config' subdir."""
|
||||
self._save_json_to_user_config(data, self.ASSET_TYPE_DEFINITIONS_FILENAME, subdir=self.USER_CONFIG_SUBDIR_NAME, is_root_key_data="ASSET_TYPE_DEFINITIONS")
|
||||
|
||||
def save_file_type_definitions(self, data: dict):
|
||||
"""Saves file type definitions to the user config directory's 'config' subdir."""
|
||||
self._save_json_to_user_config(data, self.FILE_TYPE_DEFINITIONS_FILENAME, subdir=self.USER_CONFIG_SUBDIR_NAME, is_root_key_data="FILE_TYPE_DEFINITIONS")
|
||||
|
||||
def save_supplier_settings(self, data: dict):
|
||||
"""Saves supplier settings to the user config directory's 'config' subdir."""
|
||||
self._save_json_to_user_config(data, self.SUPPLIERS_CONFIG_FILENAME, subdir=self.USER_CONFIG_SUBDIR_NAME)
|
||||
|
||||
def save_preset(self, preset_data: dict, preset_name_stem: str):
|
||||
"""Saves a preset to the user config directory's 'Presets' subdir."""
|
||||
if not preset_name_stem:
|
||||
raise ConfigurationError("Preset name stem cannot be empty for saving.")
|
||||
preset_filename = f"{preset_name_stem}.json"
|
||||
# Ensure the preset_data itself contains the correct 'preset_name' field
|
||||
# or update it before saving if necessary.
|
||||
# For example: preset_data['preset_name'] = preset_name_stem
|
||||
self._save_json_to_user_config(preset_data, preset_filename, subdir=self.USER_PRESETS_SUBDIR_NAME)
|
||||
|
||||
|
||||
@property
def keybind_config(self) -> dict[str, list[str]]:
"""
@@ -940,60 +742,275 @@ class Configuration:
# For now, we rely on the order they appear in the config.
return keybinds

# The global load_base_config() is effectively replaced by Configuration.__init__
# Global save/load functions for individual files are refactored to be methods
# of the Configuration class or called by them, using instance paths.

# For example, to get a list of preset names, one might need a static method
# or a function that knows about both bundled and user preset directories.
def get_available_preset_names(base_dir_user_config: Optional[Path], base_dir_app_bundled: Path) -> list[str]:
def load_base_config() -> dict:
"""
Gets a list of available preset names (stems) by looking in user presets
and then bundled presets. User presets take precedence.
Loads base configuration by merging app_settings.json, user_settings.json (if exists),
asset_type_definitions.json, and file_type_definitions.json.
Does not load presets or perform full validation beyond basic file loading.
Returns a dictionary containing the merged settings. If app_settings.json
fails to load, an empty dictionary is returned. If other files
fail, errors are logged, and the function proceeds with what has been loaded.
"""
preset_names = set()
base_settings = {}

# Check user presets first
if base_dir_user_config:
user_presets_dir = base_dir_user_config / Configuration.USER_PRESETS_SUBDIR_NAME
if user_presets_dir.is_dir():
for f in user_presets_dir.glob("*.json"):
preset_names.add(f.stem)
# 1. Load app_settings.json (critical)
if not APP_SETTINGS_PATH.is_file():
log.error(f"Critical: Base application settings file not found: {APP_SETTINGS_PATH}. Returning empty configuration.")
return {}
try:
with open(APP_SETTINGS_PATH, 'r', encoding='utf-8') as f:
base_settings = json.load(f)
log.info(f"Successfully loaded base application settings from: {APP_SETTINGS_PATH}")
except json.JSONDecodeError as e:
log.error(f"Critical: Failed to parse base application settings file {APP_SETTINGS_PATH}: Invalid JSON - {e}. Returning empty configuration.")
return {}
except Exception as e:
log.error(f"Critical: Failed to read base application settings file {APP_SETTINGS_PATH}: {e}. Returning empty configuration.")
return {}

# Check bundled presets
bundled_presets_dir = base_dir_app_bundled / Configuration.PRESETS_DIR_APP_BUNDLED_NAME
if bundled_presets_dir.is_dir():
for f in bundled_presets_dir.glob("*.json"):
preset_names.add(f.stem) # Adds if not already present from user dir
# 2. Attempt to load user_settings.json
user_settings_overrides = {}
if USER_SETTINGS_PATH.is_file():
try:
with open(USER_SETTINGS_PATH, 'r', encoding='utf-8') as f:
user_settings_overrides = json.load(f)
log.info(f"User settings loaded successfully for base_config from {USER_SETTINGS_PATH}.")
except json.JSONDecodeError as e:
log.warning(f"Failed to parse user settings file {USER_SETTINGS_PATH} for base_config: Invalid JSON - {e}. Proceeding without these user overrides.")
except Exception as e:
log.warning(f"Failed to read user settings file {USER_SETTINGS_PATH} for base_config: {e}. Proceeding without these user overrides.")

if not preset_names:
log.warning("No preset files found in user or bundled preset directories.")
# Consider adding a default/template preset if none are found, or ensure one always exists in bundle.
# For now, return empty list.
# 3. Deep merge user settings onto base_settings
if user_settings_overrides:
log.info("Applying user setting overrides to base_settings in load_base_config.")
# _deep_merge_dicts modifies base_settings in place
_deep_merge_dicts(base_settings, user_settings_overrides)

return sorted(list(preset_names))
# 4. Load asset_type_definitions.json (non-critical, merge if successful)
if not ASSET_TYPE_DEFINITIONS_PATH.is_file():
log.error(f"Asset type definitions file not found: {ASSET_TYPE_DEFINITIONS_PATH}. Proceeding without it.")
else:
try:
with open(ASSET_TYPE_DEFINITIONS_PATH, 'r', encoding='utf-8') as f:
asset_defs_data = json.load(f)
if "ASSET_TYPE_DEFINITIONS" in asset_defs_data:
if isinstance(asset_defs_data["ASSET_TYPE_DEFINITIONS"], dict):
# Merge into base_settings, which might already contain user overrides
base_settings['ASSET_TYPE_DEFINITIONS'] = asset_defs_data["ASSET_TYPE_DEFINITIONS"]
log.info(f"Successfully loaded and merged ASSET_TYPE_DEFINITIONS from: {ASSET_TYPE_DEFINITIONS_PATH}")
else:
log.error(f"Value under 'ASSET_TYPE_DEFINITIONS' in {ASSET_TYPE_DEFINITIONS_PATH} is not a dictionary. Skipping merge.")
else:
log.error(f"Key 'ASSET_TYPE_DEFINITIONS' not found in {ASSET_TYPE_DEFINITIONS_PATH}. Skipping merge.")
except json.JSONDecodeError as e:
log.error(f"Failed to parse asset type definitions file {ASSET_TYPE_DEFINITIONS_PATH}: Invalid JSON - {e}. Skipping merge.")
except Exception as e:
log.error(f"Failed to read asset type definitions file {ASSET_TYPE_DEFINITIONS_PATH}: {e}. Skipping merge.")

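The hunk above relies on `_deep_merge_dicts`, which is not shown in this diff. A minimal sketch of such an in-place deep merge, assuming the helper's conventional shape (the project's actual implementation may differ):

```python
def _deep_merge_dicts(base: dict, overrides: dict) -> None:
    """Recursively merge `overrides` into `base`, modifying `base` in place.

    Nested dicts are merged key by key; any other value type in
    `overrides` simply replaces the corresponding value in `base`.
    """
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(base.get(key), dict):
            _deep_merge_dicts(base[key], value)
        else:
            base[key] = value


# Example data (hypothetical settings, for illustration only)
base_settings = {"paths": {"out": "/tmp", "cache": "/var"}, "threads": 4}
user_overrides = {"paths": {"out": "/home/user"}, "verbose": True}
_deep_merge_dicts(base_settings, user_overrides)
# base_settings keeps "cache", overrides "out", and gains "verbose"
```

Merging in place matches the comment in the hunk ("_deep_merge_dicts modifies base_settings in place"), which is why `load_base_config` can keep using the same `base_settings` reference afterwards.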
# Global functions like load_asset_definitions, save_asset_definitions etc.
# are now instance methods of the Configuration class (e.g., self.save_asset_type_definitions).
# If any external code was calling these global functions, it will need to be updated
# to instantiate a Configuration object and call its methods, or these global
# functions need to be carefully adapted to instantiate Configuration internally
# or accept a Configuration instance.
# 5. Load file_type_definitions.json (non-critical, merge if successful)
if not FILE_TYPE_DEFINITIONS_PATH.is_file():
log.error(f"File type definitions file not found: {FILE_TYPE_DEFINITIONS_PATH}. Proceeding without it.")
else:
try:
with open(FILE_TYPE_DEFINITIONS_PATH, 'r', encoding='utf-8') as f:
file_defs_data = json.load(f)
if "FILE_TYPE_DEFINITIONS" in file_defs_data:
if isinstance(file_defs_data["FILE_TYPE_DEFINITIONS"], dict):
# Merge into base_settings
base_settings['FILE_TYPE_DEFINITIONS'] = file_defs_data["FILE_TYPE_DEFINITIONS"]
log.info(f"Successfully loaded and merged FILE_TYPE_DEFINITIONS from: {FILE_TYPE_DEFINITIONS_PATH}")
else:
log.error(f"Value under 'FILE_TYPE_DEFINITIONS' in {FILE_TYPE_DEFINITIONS_PATH} is not a dictionary. Skipping merge.")
else:
log.error(f"Key 'FILE_TYPE_DEFINITIONS' not found in {FILE_TYPE_DEFINITIONS_PATH}. Skipping merge.")
except json.JSONDecodeError as e:
log.error(f"Failed to parse file type definitions file {FILE_TYPE_DEFINITIONS_PATH}: Invalid JSON - {e}. Skipping merge.")
except Exception as e:
log.error(f"Failed to read file type definitions file {FILE_TYPE_DEFINITIONS_PATH}: {e}. Skipping merge.")

# For now, let's assume the primary interaction is via Configuration instance.
# The old global functions below this point are effectively deprecated by the class methods.
# I will remove them to avoid confusion and ensure all save/load operations
# are managed through the Configuration instance with correct path context.
return base_settings

# Removing old global load/save functions as their logic is now
# part of the Configuration class or replaced by its new loading/saving mechanisms.
# load_base_config() - Replaced by Configuration.__init__()
# save_llm_config(settings_dict: dict) - Replaced by Configuration.save_llm_settings()
# save_user_config(settings_dict: dict) - Replaced by Configuration.save_user_settings()
# save_base_config(settings_dict: dict) - Bundled app_settings.json should be read-only.
# load_asset_definitions() -> dict - Replaced by Configuration._load_definition_file_with_fallback() logic
# save_asset_definitions(data: dict) - Replaced by Configuration.save_asset_type_definitions()
# load_file_type_definitions() -> dict - Replaced by Configuration._load_definition_file_with_fallback() logic
# save_file_type_definitions(data: dict) - Replaced by Configuration.save_file_type_definitions()
# load_supplier_settings() -> dict - Replaced by Configuration._load_definition_file_with_fallback() logic
# save_supplier_settings(data: dict) - Replaced by Configuration.save_supplier_settings()
def save_llm_config(settings_dict: dict):
"""
Saves the provided LLM settings dictionary to llm_settings.json.
"""
log.debug(f"Saving LLM config to: {LLM_SETTINGS_PATH}")
try:
with open(LLM_SETTINGS_PATH, 'w', encoding='utf-8') as f:
json.dump(settings_dict, f, indent=4)
# Use info level for successful save
log.info(f"LLM config saved successfully to {LLM_SETTINGS_PATH}")
except Exception as e:
log.error(f"Failed to save LLM configuration file {LLM_SETTINGS_PATH}: {e}")
# Re-raise as ConfigurationError to signal failure upstream
raise ConfigurationError(f"Failed to save LLM configuration: {e}")
def save_user_config(settings_dict: dict):
"""Saves the provided settings dictionary to user_settings.json."""
log.debug(f"Saving user config to: {USER_SETTINGS_PATH}")
try:
# Ensure parent directory exists (though 'config/' should always exist)
USER_SETTINGS_PATH.parent.mkdir(parents=True, exist_ok=True)
with open(USER_SETTINGS_PATH, 'w', encoding='utf-8') as f:
json.dump(settings_dict, f, indent=4)
log.info(f"User config saved successfully to {USER_SETTINGS_PATH}")
except Exception as e:
log.error(f"Failed to save user configuration file {USER_SETTINGS_PATH}: {e}")
raise ConfigurationError(f"Failed to save user configuration: {e}")
def save_base_config(settings_dict: dict):
"""
Saves the provided settings dictionary to app_settings.json.
"""
log.debug(f"Saving base config to: {APP_SETTINGS_PATH}")
try:
with open(APP_SETTINGS_PATH, 'w', encoding='utf-8') as f:
json.dump(settings_dict, f, indent=4)
log.debug(f"Base config saved successfully.")
except Exception as e:
log.error(f"Failed to save base configuration file {APP_SETTINGS_PATH}: {e}")
raise ConfigurationError(f"Failed to save configuration: {e}")

def load_asset_definitions() -> dict:
"""
Reads config/asset_type_definitions.json.
Returns the dictionary under the "ASSET_TYPE_DEFINITIONS" key.
Handles file not found or JSON errors gracefully (e.g., return empty dict, log error).
"""
log.debug(f"Loading asset type definitions from: {ASSET_TYPE_DEFINITIONS_PATH}")
if not ASSET_TYPE_DEFINITIONS_PATH.is_file():
log.error(f"Asset type definitions file not found: {ASSET_TYPE_DEFINITIONS_PATH}")
return {}
try:
with open(ASSET_TYPE_DEFINITIONS_PATH, 'r', encoding='utf-8') as f:
data = json.load(f)
if "ASSET_TYPE_DEFINITIONS" not in data:
log.error(f"Key 'ASSET_TYPE_DEFINITIONS' not found in {ASSET_TYPE_DEFINITIONS_PATH}")
return {}
settings = data["ASSET_TYPE_DEFINITIONS"]
if not isinstance(settings, dict):
log.error(f"'ASSET_TYPE_DEFINITIONS' in {ASSET_TYPE_DEFINITIONS_PATH} must be a dictionary.")
return {}
log.debug(f"Asset type definitions loaded successfully.")
return settings
except json.JSONDecodeError as e:
log.error(f"Failed to parse asset type definitions file {ASSET_TYPE_DEFINITIONS_PATH}: Invalid JSON - {e}")
return {}
except Exception as e:
log.error(f"Failed to read asset type definitions file {ASSET_TYPE_DEFINITIONS_PATH}: {e}")
return {}

def save_asset_definitions(data: dict):
"""
Takes a dictionary (representing the content for the "ASSET_TYPE_DEFINITIONS" key).
Writes it to config/asset_type_definitions.json under the root key "ASSET_TYPE_DEFINITIONS".
Handles potential I/O errors.
"""
log.debug(f"Saving asset type definitions to: {ASSET_TYPE_DEFINITIONS_PATH}")
try:
with open(ASSET_TYPE_DEFINITIONS_PATH, 'w', encoding='utf-8') as f:
json.dump({"ASSET_TYPE_DEFINITIONS": data}, f, indent=4)
log.info(f"Asset type definitions saved successfully to {ASSET_TYPE_DEFINITIONS_PATH}")
except Exception as e:
log.error(f"Failed to save asset type definitions file {ASSET_TYPE_DEFINITIONS_PATH}: {e}")
raise ConfigurationError(f"Failed to save asset type definitions: {e}")

def load_file_type_definitions() -> dict:
"""
Reads config/file_type_definitions.json.
Returns the dictionary under the "FILE_TYPE_DEFINITIONS" key.
Handles errors gracefully.
"""
log.debug(f"Loading file type definitions from: {FILE_TYPE_DEFINITIONS_PATH}")
if not FILE_TYPE_DEFINITIONS_PATH.is_file():
log.error(f"File type definitions file not found: {FILE_TYPE_DEFINITIONS_PATH}")
return {}
try:
with open(FILE_TYPE_DEFINITIONS_PATH, 'r', encoding='utf-8') as f:
data = json.load(f)
if "FILE_TYPE_DEFINITIONS" not in data:
log.error(f"Key 'FILE_TYPE_DEFINITIONS' not found in {FILE_TYPE_DEFINITIONS_PATH}")
return {}
settings = data["FILE_TYPE_DEFINITIONS"]
if not isinstance(settings, dict):
log.error(f"'FILE_TYPE_DEFINITIONS' in {FILE_TYPE_DEFINITIONS_PATH} must be a dictionary.")
return {}
log.debug(f"File type definitions loaded successfully.")
return settings
except json.JSONDecodeError as e:
log.error(f"Failed to parse file type definitions file {FILE_TYPE_DEFINITIONS_PATH}: Invalid JSON - {e}")
return {}
except Exception as e:
log.error(f"Failed to read file type definitions file {FILE_TYPE_DEFINITIONS_PATH}: {e}")
return {}

def save_file_type_definitions(data: dict):
"""
Takes a dictionary (representing content for "FILE_TYPE_DEFINITIONS" key).
Writes it to config/file_type_definitions.json under the root key "FILE_TYPE_DEFINITIONS".
Handles errors.
"""
log.debug(f"Saving file type definitions to: {FILE_TYPE_DEFINITIONS_PATH}")
try:
with open(FILE_TYPE_DEFINITIONS_PATH, 'w', encoding='utf-8') as f:
json.dump({"FILE_TYPE_DEFINITIONS": data}, f, indent=4)
log.info(f"File type definitions saved successfully to {FILE_TYPE_DEFINITIONS_PATH}")
except Exception as e:
log.error(f"Failed to save file type definitions file {FILE_TYPE_DEFINITIONS_PATH}: {e}")
raise ConfigurationError(f"Failed to save file type definitions: {e}")

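Both definition files wrap their payload under a single root key, written with `indent=4` and validated on load. A small round-trip sketch of that pattern, using JSON strings instead of real files (`dump_definitions` and `parse_definitions` are hypothetical names for illustration, not part of the codebase):

```python
import json

ROOT_KEY = "ASSET_TYPE_DEFINITIONS"

def dump_definitions(data: dict) -> str:
    # Wrap the payload under the root key, as save_asset_definitions does.
    return json.dumps({ROOT_KEY: data}, indent=4)

def parse_definitions(text: str) -> dict:
    # Mirror load_asset_definitions: a missing key, wrong value type,
    # or invalid JSON all degrade gracefully to an empty dict.
    try:
        data = json.loads(text)
    except json.JSONDecodeError:
        return {}
    settings = data.get(ROOT_KEY)
    return settings if isinstance(settings, dict) else {}

defs = {"Surface": {"description": "Tileable material"}}
assert parse_definitions(dump_definitions(defs)) == defs
assert parse_definitions("not json") == {}
assert parse_definitions('{"OTHER_KEY": {}}') == {}
```

The root key makes the file self-describing, which is what lets the loaders reject a file whose top-level structure does not match what the application expects.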
def load_supplier_settings() -> dict:
"""
Reads config/suppliers.json.
Returns the entire dictionary.
Handles file not found (return empty dict) or JSON errors.
If the loaded data is a list (old format), convert it in memory to the new
dictionary format, defaulting normal_map_type to "OpenGL" for each supplier.
"""
log.debug(f"Loading supplier settings from: {SUPPLIERS_CONFIG_PATH}")
if not SUPPLIERS_CONFIG_PATH.is_file():
log.warning(f"Supplier settings file not found: {SUPPLIERS_CONFIG_PATH}. Returning empty dict.")
return {}
try:
with open(SUPPLIERS_CONFIG_PATH, 'r', encoding='utf-8') as f:
data = json.load(f)

if isinstance(data, list):
log.warning(f"Supplier settings in {SUPPLIERS_CONFIG_PATH} is in the old list format. Converting to new dictionary format.")
new_data = {}
for supplier_name in data:
if isinstance(supplier_name, str):
new_data[supplier_name] = {"normal_map_type": "OpenGL"}
else:
log.warning(f"Skipping non-string item '{supplier_name}' during old format conversion of supplier settings.")
log.info(f"Supplier settings converted to new format: {new_data}")
return new_data

if not isinstance(data, dict):
log.error(f"Supplier settings in {SUPPLIERS_CONFIG_PATH} must be a dictionary. Found {type(data)}. Returning empty dict.")
return {}

log.debug(f"Supplier settings loaded successfully.")
return data
except json.JSONDecodeError as e:
log.error(f"Failed to parse supplier settings file {SUPPLIERS_CONFIG_PATH}: Invalid JSON - {e}. Returning empty dict.")
return {}
except Exception as e:
log.error(f"Failed to read supplier settings file {SUPPLIERS_CONFIG_PATH}: {e}. Returning empty dict.")
return {}

def save_supplier_settings(data: dict):
"""
Takes a dictionary (in the new format).
Writes it directly to config/suppliers.json.
Handles errors.
"""
log.debug(f"Saving supplier settings to: {SUPPLIERS_CONFIG_PATH}")
if not isinstance(data, dict):
log.error(f"Data for save_supplier_settings must be a dictionary. Got {type(data)}.")
raise ConfigurationError(f"Invalid data type for saving supplier settings: {type(data)}")
try:
with open(SUPPLIERS_CONFIG_PATH, 'w', encoding='utf-8') as f:
json.dump(data, f, indent=2) # Using indent=2 as per the example for suppliers.json
log.info(f"Supplier settings saved successfully to {SUPPLIERS_CONFIG_PATH}")
except Exception as e:
log.error(f"Failed to save supplier settings file {SUPPLIERS_CONFIG_PATH}: {e}")
raise ConfigurationError(f"Failed to save supplier settings: {e}")

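The old-list-to-dict migration inside `load_supplier_settings` can be sketched in isolation (`convert_supplier_list` is a hypothetical helper name used here for illustration, not a function in the codebase):

```python
def convert_supplier_list(data):
    """Convert the old list format to the new dict format, defaulting
    normal_map_type to "OpenGL" and skipping non-string entries,
    mirroring the conversion branch in load_supplier_settings."""
    if not isinstance(data, list):
        return data  # already in the new dict format (or invalid)
    new_data = {}
    for supplier_name in data:
        if isinstance(supplier_name, str):
            new_data[supplier_name] = {"normal_map_type": "OpenGL"}
    return new_data

old = ["Poliigon", "Quixel", 42]  # 42 simulates a corrupt entry
converted = convert_supplier_list(old)
# converted maps each string supplier to {"normal_map_type": "OpenGL"}
```

Doing the conversion in memory (rather than rewriting the file on load) keeps `load_supplier_settings` side-effect free; the new format only reaches disk when `save_supplier_settings` is next called.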
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -973,26 +973,26 @@ class ConfigEditorDialog(QDialog):
self.merge_rule_details_layout.addRow(group)
self.merge_rule_widgets["defaults_table"] = defaults_table

# bit_depth_policy: QComboBox (Options: "preserve", "force_8bit", "force_16bit"). Label: "Bit Depth Policy".
if "bit_depth_policy" in rule_data:
label = QLabel("Bit Depth Policy:")
widget = QComboBox()
options = ["preserve", "force_8bit", "force_16bit"]
widget.addItems(options)
if rule_data["bit_depth_policy"] in options:
widget.setCurrentText(rule_data["bit_depth_policy"])
self.merge_rule_details_layout.addRow(label, widget)
self.merge_rule_widgets["bit_depth_policy"] = widget

# output_bit_depth: QComboBox (Options: "respect_inputs", "force_8bit", "force_16bit"). Label: "Output Bit Depth".
if "output_bit_depth" in rule_data:
label = QLabel("Output Bit Depth:")
widget = QComboBox()
options = ["respect_inputs", "force_8bit", "force_16bit"]
widget.addItems(options)
if rule_data["output_bit_depth"] in options:
widget.setCurrentText(rule_data["output_bit_depth"])
self.merge_rule_details_layout.addRow(label, widget)
self.merge_rule_widgets["output_bit_depth"] = widget

# Add stretch to push widgets to the top
self.merge_rule_details_layout.addStretch()


# Connect bit_depth_policy QComboBox to update rule data
if "bit_depth_policy" in self.merge_rule_widgets and isinstance(self.merge_rule_widgets["bit_depth_policy"], QComboBox):
self.merge_rule_widgets["bit_depth_policy"].currentTextChanged.connect(
lambda text, key="bit_depth_policy": self.update_rule_data_simple_field(text, key)
)
# Connect output_bit_depth QComboBox to update rule data
if "output_bit_depth" in self.merge_rule_widgets and isinstance(self.merge_rule_widgets["output_bit_depth"], QComboBox):
self.merge_rule_widgets["output_bit_depth"].currentTextChanged.connect(
lambda text, key="output_bit_depth": self.update_rule_data_simple_field(text, key)
)


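One likely reason for the `key="bit_depth_policy"` default argument in the connected lambdas above is that a default pins the value at definition time, whereas a plain closure reads the variable at call time; the latter bites whenever such callbacks are created in a loop. A Qt-free sketch of the difference (the names here are illustrative only):

```python
callbacks_late, callbacks_bound = [], []
for key in ["bit_depth_policy", "output_bit_depth"]:
    # Late binding: `key` is looked up when the lambda is CALLED,
    # so every callback sees the loop variable's final value.
    callbacks_late.append(lambda text: (text, key))
    # Default argument: `key` is evaluated NOW, at definition time.
    callbacks_bound.append(lambda text, key=key: (text, key))

late = [cb("x")[1] for cb in callbacks_late]
bound = [cb("x")[1] for cb in callbacks_bound]
# late  == ["output_bit_depth", "output_bit_depth"]  (both see the last key)
# bound == ["bit_depth_policy", "output_bit_depth"]  (each keeps its own key)
```

Even though the two `connect` calls in the hunk are written out explicitly rather than in a loop, the default-argument form also documents which field each signal updates.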
@@ -1107,7 +1107,7 @@ if "bit_depth_policy" in self.merge_rule_widgets and isinstance(self.merge_rule_
"output_map_type": "NEW_RULE",
"inputs": {"R": "", "G": "", "B": "", "A": ""},
"defaults": {"R": 0.0, "G": 0.0, "B": 0.0, "A": 1.0},
"bit_depth_policy": "preserve"
"output_bit_depth": "respect_inputs"
}

# Add to the internal list that backs the UI
@@ -1417,9 +1417,8 @@ if "bit_depth_policy" in self.merge_rule_widgets and isinstance(self.merge_rule_
self.widgets["RESOLUTION_THRESHOLD_FOR_JPG"].setCurrentText(current_text_selection)


# The MAP_BIT_DEPTH_RULES table is removed as per refactoring plan.
# elif key == "MAP_BIT_DEPTH_RULES" and "MAP_BIT_DEPTH_RULES_TABLE" in self.widgets:
# self.populate_map_bit_depth_rules_table(self.widgets["MAP_BIT_DEPTH_RULES_TABLE"], value)
elif key == "MAP_BIT_DEPTH_RULES" and "MAP_BIT_DEPTH_RULES_TABLE" in self.widgets:
self.populate_map_bit_depth_rules_table(self.widgets["MAP_BIT_DEPTH_RULES_TABLE"], value)


elif key == "MAP_MERGE_RULES" and hasattr(self, 'merge_rules_list'): # Check if the list widget exists
@@ -1493,10 +1492,10 @@ if "bit_depth_policy" in self.merge_rule_widgets and isinstance(self.merge_rule_
item_standard_type = QTableWidgetItem(standard_type_str)
table.setItem(row, 4, item_standard_type)

# Bit Depth Policy column (simple QTableWidgetItem for now)
bit_depth_policy_str = details.get("bit_depth_policy", "")
item_bit_depth_policy = QTableWidgetItem(bit_depth_policy_str)
table.setItem(row, 5, item_bit_depth_policy)
# Bit Depth Rule column (simple QTableWidgetItem for now)
bit_depth_rule_str = details.get("bit_depth_rule", "")
item_bit_depth_rule = QTableWidgetItem(bit_depth_rule_str)
table.setItem(row, 5, item_bit_depth_rule)

# Background color is now handled by the delegate's paint method based on data

@@ -1526,15 +1525,14 @@ if "bit_depth_policy" in self.merge_rule_widgets and isinstance(self.merge_rule_
row += 1


# The populate_map_bit_depth_rules_table method is removed as per refactoring plan.
# def populate_map_bit_depth_rules_table(self, table: QTableWidget, rules_data: dict):
# """Populates the map bit depth rules table."""
# table.setRowCount(len(rules_data))
# row = 0
# for map_type, rule in rules_data.items():
# table.setItem(row, 0, QTableWidgetItem(map_type))
# table.setItem(row, 1, QTableWidgetItem(str(rule))) # Rule (respect/force_8bit)
# row += 1
def populate_map_bit_depth_rules_table(self, table: QTableWidget, rules_data: dict):
"""Populates the map bit depth rules table."""
table.setRowCount(len(rules_data))
row = 0
for map_type, rule in rules_data.items():
table.setItem(row, 0, QTableWidgetItem(map_type))
table.setItem(row, 1, QTableWidgetItem(str(rule))) # Rule (respect/force_8bit)
row += 1


@@ -8,16 +8,38 @@ from PySide6.QtWidgets import (
from PySide6.QtGui import QColor, QPalette, QMouseEvent # Added QMouseEvent
from PySide6.QtCore import Qt, QEvent

from PySide6.QtGui import QColor, QPalette, QMouseEvent
from PySide6.QtCore import Qt, QEvent

# Import the Configuration class
from configuration import Configuration, ConfigurationError
# Assuming load_asset_definitions, load_file_type_definitions, load_supplier_settings
# are in configuration.py at the root level.
# Adjust the import path if configuration.py is located elsewhere relative to this file.
# For example, if configuration.py is in the parent directory:
# from ..configuration import load_asset_definitions, load_file_type_definitions, load_supplier_settings
# Or if it's in the same directory (less likely for a root config file):
# from .configuration import ...
# Given the project structure, configuration.py is at the root.
import sys
import os
# Add project root to sys.path to allow direct import of configuration
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
try:
from configuration import (
load_asset_definitions, save_asset_definitions,
load_file_type_definitions, save_file_type_definitions,
load_supplier_settings, save_supplier_settings
)
except ImportError as e:
logging.error(f"Failed to import configuration functions: {e}. Ensure configuration.py is in the project root and accessible.")
# Provide dummy functions if import fails, so the UI can still be tested somewhat
def load_asset_definitions(): return {}
def save_asset_definitions(data): pass
def load_file_type_definitions(): return {}
def save_file_type_definitions(data): pass
def load_supplier_settings(): return {}
def save_supplier_settings(data): pass

logger = logging.getLogger(__name__)

class DebugListWidget(QListWidget):
|
||||
def mousePressEvent(self, event: QMouseEvent):
|
||||
def mousePressEvent(self, event: QMouseEvent): # QMouseEvent needs to be imported from PySide6.QtGui
|
||||
logger.info(f"DebugListWidget.mousePressEvent: pos={event.pos()}")
|
||||
item = self.itemAt(event.pos())
|
||||
if item:
|
||||
@ -28,9 +50,8 @@ class DebugListWidget(QListWidget):
|
||||
logger.info("DebugListWidget.mousePressEvent: super call finished.")
|
||||
|
||||
class DefinitionsEditorDialog(QDialog):
|
||||
def __init__(self, config: Configuration, parent=None):
|
||||
def __init__(self, parent=None):
|
||||
super().__init__(parent)
|
||||
self.config = config # Store the Configuration object
|
||||
self.setWindowTitle("Definitions Editor")
|
||||
self.setGeometry(200, 200, 800, 600) # x, y, width, height
|
||||
|
||||
@ -567,8 +588,8 @@ class DefinitionsEditorDialog(QDialog):
|
||||
|
||||
# Bit Depth Rule
|
||||
self.ft_bit_depth_combo = QComboBox()
|
||||
self.ft_bit_depth_combo.addItems(["preserve", "force_8bit", "force_16bit"])
|
||||
details_layout.addRow("Bit Depth Policy:", self.ft_bit_depth_combo)
self.ft_bit_depth_combo.addItems(["respect", "force_8bit", "force_16bit"])
details_layout.addRow("Bit Depth Rule:", self.ft_bit_depth_combo)

# Is Grayscale
self.ft_is_grayscale_check = QCheckBox("Is Grayscale")
@ -606,7 +627,7 @@ class DefinitionsEditorDialog(QDialog):
logger.warning(f"File type data for '{key}' is not a dict: {ft_data_item}. Using default.")
ft_data_item = {
"description": str(ft_data_item), "color": "#ffffff", "examples": [],
"standard_type": "", "bit_depth_policy": "preserve",
"standard_type": "", "bit_depth_rule": "respect",
"is_grayscale": False, "keybind": ""
}

@ -615,7 +636,7 @@ class DefinitionsEditorDialog(QDialog):
ft_data_item.setdefault('color', '#ffffff')
ft_data_item.setdefault('examples', [])
ft_data_item.setdefault('standard_type', '')
ft_data_item.setdefault('bit_depth_policy', 'preserve')
ft_data_item.setdefault('bit_depth_rule', 'respect')
ft_data_item.setdefault('is_grayscale', False)
ft_data_item.setdefault('keybind', '')

@ -651,7 +672,7 @@ class DefinitionsEditorDialog(QDialog):
logger.error(f"Invalid data for file type item {current_item.text()}. Expected dict, got {type(ft_data)}")
ft_data = {
"description": "Error: Invalid data", "color": "#ff0000", "examples": [],
"standard_type": "error", "bit_depth_policy": "preserve",
"standard_type": "error", "bit_depth_rule": "respect",
"is_grayscale": False, "keybind": "X"
}

@ -664,11 +685,11 @@ class DefinitionsEditorDialog(QDialog):

self.ft_standard_type_edit.setText(ft_data.get('standard_type', ''))

bdr_index = self.ft_bit_depth_combo.findText(ft_data.get('bit_depth_policy', 'preserve'))
bdr_index = self.ft_bit_depth_combo.findText(ft_data.get('bit_depth_rule', 'respect'))
if bdr_index != -1:
self.ft_bit_depth_combo.setCurrentIndex(bdr_index)
else:
self.ft_bit_depth_combo.setCurrentIndex(0) # Default to 'preserve'
self.ft_bit_depth_combo.setCurrentIndex(0) # Default to 'respect'

self.ft_is_grayscale_check.setChecked(ft_data.get('is_grayscale', False))
self.ft_keybind_edit.setText(ft_data.get('keybind', ''))
@ -725,7 +746,7 @@ class DefinitionsEditorDialog(QDialog):
"color": "#ffffff",
"examples": [],
"standard_type": "",
"bit_depth_policy": "preserve",
"bit_depth_rule": "respect",
"is_grayscale": False,
"keybind": ""
}
@ -869,7 +890,7 @@ class DefinitionsEditorDialog(QDialog):
# Update based on which widget triggered (or update all)
ft_data['description'] = self.ft_description_edit.toPlainText()
ft_data['standard_type'] = self.ft_standard_type_edit.text()
ft_data['bit_depth_policy'] = self.ft_bit_depth_combo.currentText()
ft_data['bit_depth_rule'] = self.ft_bit_depth_combo.currentText()
ft_data['is_grayscale'] = self.ft_is_grayscale_check.isChecked()

# Keybind validation (force uppercase)

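These hunks rename the per-file-type key `bit_depth_policy` (default `"preserve"`) to `bit_depth_rule` (default `"respect"`), but the diff only `setdefault`s the new key — definition files written by older builds would still carry the legacy key. A migration step for such files might look like the following sketch; `migrate_file_type_entry` and `LEGACY_POLICY_TO_RULE` are hypothetical names, not part of the diff:

```python
# Hypothetical migration helper (not in the diff): maps the legacy
# 'bit_depth_policy' value onto the new 'bit_depth_rule' key.
LEGACY_POLICY_TO_RULE = {
    "preserve": "respect",        # old default -> new default (assumed mapping)
    "force_8bit": "force_8bit",
    "force_16bit": "force_16bit",
}

def migrate_file_type_entry(ft_data: dict) -> dict:
    # Only migrate when the new key is absent, so already-migrated
    # entries are left untouched.
    if "bit_depth_rule" not in ft_data and "bit_depth_policy" in ft_data:
        legacy = ft_data.pop("bit_depth_policy")
        ft_data["bit_depth_rule"] = LEGACY_POLICY_TO_RULE.get(legacy, "respect")
    ft_data.setdefault("bit_depth_rule", "respect")  # mirrors the diff's setdefault
    return ft_data
```

The `setdefault` call at the end matches what the diff itself does for entries that never had either key.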
@ -1,7 +1,7 @@
from pathlib import Path
from PySide6.QtWidgets import QStyledItemDelegate, QLineEdit, QComboBox
from PySide6.QtCore import Qt, QModelIndex
from configuration import Configuration, ConfigurationError # Keep load_base_config for SupplierSearchDelegate
from configuration import Configuration, ConfigurationError, load_base_config # Keep load_base_config for SupplierSearchDelegate
from PySide6.QtWidgets import QListWidgetItem

import json

@ -1,388 +0,0 @@
import sys
import os
import shutil
import json
from pathlib import Path
from typing import Optional, Tuple

from PySide6.QtWidgets import (
QDialog, QVBoxLayout, QHBoxLayout, QLabel, QLineEdit, QPushButton,
QFileDialog, QMessageBox, QGroupBox, QFormLayout, QSpinBox, QDialogButtonBox
)
from PySide6.QtCore import Qt, Slot

# Constants for bundled resource locations relative to app base
BUNDLED_CONFIG_SUBDIR_NAME = "config"
BUNDLED_PRESETS_SUBDIR_NAME = "Presets"
DEFAULT_USER_DATA_SUBDIR_NAME = "user_data" # For portable path attempt

# Files to copy from bundled config to user config
DEFAULT_CONFIG_FILES = [
"asset_type_definitions.json",
"file_type_definitions.json",
"llm_settings.json",
"suppliers.json"
]
# app_settings.json is NOT copied. user_settings.json is handled separately.

USER_SETTINGS_FILENAME = "user_settings.json"
PERSISTENT_PATH_MARKER_FILENAME = ".first_run_complete"
PERSISTENT_CONFIG_ROOT_STORAGE_FILENAME = "asset_processor_user_root.txt" # Stores USER_CHOSEN_PATH

APP_NAME = "AssetProcessor" # Used for AppData paths

def get_app_base_dir() -> Path:
"""Determines the base directory for the application (executable or script)."""
if getattr(sys, 'frozen', False) and hasattr(sys, '_MEIPASS'):
# Running in a PyInstaller bundle
return Path(sys._MEIPASS)
else:
# Running as a script
return Path(__file__).resolve().parent.parent # Assuming this file is in gui/ subdir

def get_os_specific_app_data_dir() -> Path:
"""Gets the OS-specific application data directory."""
if sys.platform == "win32":
path_str = os.getenv('APPDATA')
if path_str:
return Path(path_str) / APP_NAME
# Fallback if APPDATA is not set, though unlikely
return Path.home() / "AppData" / "Roaming" / APP_NAME
elif sys.platform == "darwin": # macOS
return Path.home() / "Library" / "Application Support" / APP_NAME
else: # Linux and other Unix-like
return Path.home() / ".config" / APP_NAME

class FirstTimeSetupDialog(QDialog):
def __init__(self, parent=None):
super().__init__(parent)
self.setWindowTitle("Asset Processor - First-Time Setup")
self.setModal(True)
self.setMinimumWidth(600)

self.app_base_dir = get_app_base_dir()
self.user_chosen_path: Optional[Path] = None

self._init_ui()
self._propose_default_config_path()

def _init_ui(self):
main_layout = QVBoxLayout(self)

# Configuration Path Group
config_path_group = QGroupBox("Configuration Location")
config_path_layout = QVBoxLayout()

self.proposed_path_label = QLabel("Proposed default configuration path:")
config_path_layout.addWidget(self.proposed_path_label)

path_selection_layout = QHBoxLayout()
self.config_path_edit = QLineEdit()
self.config_path_edit.setReadOnly(False) # Allow editing, then validate
path_selection_layout.addWidget(self.config_path_edit)

browse_button = QPushButton("Browse...")
browse_button.clicked.connect(self._browse_config_path)
path_selection_layout.addWidget(browse_button)
config_path_layout.addLayout(path_selection_layout)
config_path_group.setLayout(config_path_layout)
main_layout.addWidget(config_path_group)

# User Settings Group
user_settings_group = QGroupBox("Initial User Settings")
user_settings_form_layout = QFormLayout()

self.output_base_dir_edit = QLineEdit()
output_base_dir_browse_button = QPushButton("Browse...")
output_base_dir_browse_button.clicked.connect(self._browse_output_base_dir)
output_base_dir_layout = QHBoxLayout()
output_base_dir_layout.addWidget(self.output_base_dir_edit)
output_base_dir_layout.addWidget(output_base_dir_browse_button)
user_settings_form_layout.addRow("Default Library Output Path:", output_base_dir_layout)

self.output_dir_pattern_edit = QLineEdit("[supplier]/[asset_category]/[asset_name]")
user_settings_form_layout.addRow("Asset Structure Pattern:", self.output_dir_pattern_edit)

self.output_format_16bit_primary_edit = QLineEdit("png")
user_settings_form_layout.addRow("Default 16-bit Output Format (Primary):", self.output_format_16bit_primary_edit)

self.output_format_8bit_edit = QLineEdit("png")
user_settings_form_layout.addRow("Default 8-bit Output Format:", self.output_format_8bit_edit)

self.resolution_threshold_jpg_spinbox = QSpinBox()
self.resolution_threshold_jpg_spinbox.setRange(256, 16384)
self.resolution_threshold_jpg_spinbox.setValue(4096)
self.resolution_threshold_jpg_spinbox.setSuffix(" px")
user_settings_form_layout.addRow("JPG Resolution Threshold (for 8-bit):", self.resolution_threshold_jpg_spinbox)

user_settings_group.setLayout(user_settings_form_layout)
main_layout.addWidget(user_settings_group)

# Dialog Buttons
self.button_box = QDialogButtonBox(QDialogButtonBox.StandardButton.Ok | QDialogButtonBox.StandardButton.Cancel)
self.button_box.button(QDialogButtonBox.StandardButton.Ok).setText("Finish Setup")
self.button_box.accepted.connect(self._on_finish_setup)
self.button_box.rejected.connect(self.reject)
main_layout.addWidget(self.button_box)

def _propose_default_config_path(self):
proposed_path = None

# 1. Try portable path: user_data/ next to the application base dir
# If running from script, app_base_dir is .../Asset_processor_tool/gui, so parent is .../Asset_processor_tool
# If bundled, app_base_dir is the directory of the executable.

# Let's refine app_base_dir for portable path logic
# If script: Path(__file__).parent.parent = Asset_processor_tool
# If frozen: sys._MEIPASS (which is the temp extraction dir, not ideal for persistent user_data)
# A better approach for portable if frozen: Path(sys.executable).parent

current_app_dir = Path(sys.executable).parent if getattr(sys, 'frozen', False) else self.app_base_dir

portable_path_candidate = current_app_dir / DEFAULT_USER_DATA_SUBDIR_NAME
try:
portable_path_candidate.mkdir(parents=True, exist_ok=True)
if os.access(str(portable_path_candidate), os.W_OK):
proposed_path = portable_path_candidate
self.proposed_path_label.setText(f"Proposed portable path (writable):")
else:
self.proposed_path_label.setText(f"Portable path '{portable_path_candidate}' not writable.")
except Exception as e:
self.proposed_path_label.setText(f"Could not use portable path '{portable_path_candidate}': {e}")
print(f"Error checking/creating portable path: {e}") # For debugging

# 2. Fallback to OS-specific app data directory
if not proposed_path:
os_specific_path = get_os_specific_app_data_dir()
try:
os_specific_path.mkdir(parents=True, exist_ok=True)
if os.access(str(os_specific_path), os.W_OK):
proposed_path = os_specific_path
self.proposed_path_label.setText(f"Proposed standard path (writable):")
else:
self.proposed_path_label.setText(f"Standard path '{os_specific_path}' not writable. Please choose a location.")
except Exception as e:
self.proposed_path_label.setText(f"Could not use standard path '{os_specific_path}': {e}. Please choose a location.")
print(f"Error checking/creating standard path: {e}") # For debugging

if proposed_path:
self.config_path_edit.setText(str(proposed_path.resolve()))
else:
# Should not happen if OS specific path creation works, but as a last resort:
self.config_path_edit.setText(str(Path.home())) # Default to home if all else fails
QMessageBox.warning(self, "Path Issue", "Could not determine a default writable configuration path. Please select one manually.")

@Slot()
def _browse_config_path(self):
directory = QFileDialog.getExistingDirectory(
self,
"Select Configuration Directory",
self.config_path_edit.text() or str(Path.home())
)
if directory:
self.config_path_edit.setText(directory)

@Slot()
def _browse_output_base_dir(self):
directory = QFileDialog.getExistingDirectory(
self,
"Select Default Library Output Directory",
self.output_base_dir_edit.text() or str(Path.home())
)
if directory:
self.output_base_dir_edit.setText(directory)

def _validate_inputs(self) -> bool:
# Validate chosen config path
path_str = self.config_path_edit.text().strip()
if not path_str:
QMessageBox.warning(self, "Input Error", "Configuration path cannot be empty.")
return False

self.user_chosen_path = Path(path_str)
try:
self.user_chosen_path.mkdir(parents=True, exist_ok=True)
if not os.access(str(self.user_chosen_path), os.W_OK):
QMessageBox.warning(self, "Path Error", f"The chosen configuration path '{self.user_chosen_path}' is not writable.")
return False
except Exception as e:
QMessageBox.warning(self, "Path Error", f"Error with chosen configuration path '{self.user_chosen_path}': {e}")
return False

# Validate output base dir
output_base_dir_str = self.output_base_dir_edit.text().strip()
if not output_base_dir_str:
QMessageBox.warning(self, "Input Error", "Default Library Output Path cannot be empty.")
return False
try:
Path(output_base_dir_str).mkdir(parents=True, exist_ok=True) # Check if creatable
if not os.access(output_base_dir_str, os.W_OK):
QMessageBox.warning(self, "Path Error", f"The chosen output base path '{output_base_dir_str}' is not writable.")
return False
except Exception as e:
QMessageBox.warning(self, "Path Error", f"Error with output base path '{output_base_dir_str}': {e}")
return False

if not self.output_dir_pattern_edit.text().strip():
QMessageBox.warning(self, "Input Error", "Asset Structure Pattern cannot be empty.")
return False
if not self.output_format_16bit_primary_edit.text().strip():
QMessageBox.warning(self, "Input Error", "Default 16-bit Output Format cannot be empty.")
return False
if not self.output_format_8bit_edit.text().strip():
QMessageBox.warning(self, "Input Error", "Default 8-bit Output Format cannot be empty.")
return False

return True

def _copy_default_files(self):
if not self.user_chosen_path:
return

bundled_config_dir = self.app_base_dir / BUNDLED_CONFIG_SUBDIR_NAME
user_target_config_dir = self.user_chosen_path / BUNDLED_CONFIG_SUBDIR_NAME # User files also go into a 'config' subdir

try:
user_target_config_dir.mkdir(parents=True, exist_ok=True)
except Exception as e:
QMessageBox.critical(self, "Error", f"Could not create user config subdirectory '{user_target_config_dir}': {e}")
return

for filename in DEFAULT_CONFIG_FILES:
source_file = bundled_config_dir / filename
target_file = user_target_config_dir / filename
if not target_file.exists():
if source_file.is_file():
try:
shutil.copy2(str(source_file), str(target_file))
print(f"Copied '{source_file}' to '{target_file}'")
except Exception as e:
QMessageBox.warning(self, "File Copy Error", f"Could not copy '{source_file}' to '{target_file}': {e}")
else:
print(f"Default config file '{source_file}' not found in bundle.")
else:
print(f"User config file '{target_file}' already exists. Skipping copy.")

# Copy Presets
bundled_presets_dir = self.app_base_dir / BUNDLED_PRESETS_SUBDIR_NAME
user_target_presets_dir = self.user_chosen_path / BUNDLED_PRESETS_SUBDIR_NAME

if bundled_presets_dir.is_dir():
try:
user_target_presets_dir.mkdir(parents=True, exist_ok=True)
for item in bundled_presets_dir.iterdir():
target_item = user_target_presets_dir / item.name
if not target_item.exists():
if item.is_file():
shutil.copy2(str(item), str(target_item))
print(f"Copied preset '{item.name}' to '{target_item}'")
# Add elif item.is_dir() for recursive copy if presets can have subdirs
except Exception as e:
QMessageBox.warning(self, "Preset Copy Error", f"Could not copy presets to '{user_target_presets_dir}': {e}")
else:
print(f"Bundled presets directory '{bundled_presets_dir}' not found.")

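`_copy_default_files` seeds the user config directory non-destructively: each bundled default is copied only when the target file does not already exist, so user edits survive re-running setup. The core pattern, extracted as a standalone sketch (names are illustrative):

```python
import shutil
from pathlib import Path

def seed_defaults(bundled_dir: Path, user_dir: Path, filenames: list[str]) -> list[str]:
    """Copy each bundled file into user_dir unless it already exists there.

    Returns the names that were actually copied.
    """
    user_dir.mkdir(parents=True, exist_ok=True)
    copied = []
    for name in filenames:
        source, target = bundled_dir / name, user_dir / name
        if target.exists():
            continue  # never overwrite user-edited files
        if source.is_file():
            shutil.copy2(source, target)  # copy2 also preserves timestamps/metadata
            copied.append(name)
    return copied
```

Missing bundle files are silently skipped here, just as the dialog only logs them, so a partial bundle does not abort setup.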
def _save_initial_user_settings(self):
if not self.user_chosen_path:
return

user_settings_path = self.user_chosen_path / USER_SETTINGS_FILENAME
settings_data = {}

# Load existing if it exists (though unlikely for first-time setup, but good practice)
if user_settings_path.exists():
try:
with open(user_settings_path, 'r', encoding='utf-8') as f:
settings_data = json.load(f)
except Exception as e:
QMessageBox.warning(self, "Error Loading Settings", f"Could not load existing user settings from '{user_settings_path}': {e}. Will create a new one.")
settings_data = {}

# Update with new values from dialog
settings_data['OUTPUT_BASE_DIR'] = self.output_base_dir_edit.text().strip()
settings_data['OUTPUT_DIRECTORY_PATTERN'] = self.output_dir_pattern_edit.text().strip()
settings_data['OUTPUT_FORMAT_16BIT_PRIMARY'] = self.output_format_16bit_primary_edit.text().strip().lower()
settings_data['OUTPUT_FORMAT_8BIT'] = self.output_format_8bit_edit.text().strip().lower()
settings_data['RESOLUTION_THRESHOLD_FOR_JPG'] = self.resolution_threshold_jpg_spinbox.value()

# Ensure general_settings exists for app_version if needed, or other core settings
if 'general_settings' not in settings_data:
settings_data['general_settings'] = {}
# Example: settings_data['general_settings']['some_new_user_setting'] = True

try:
with open(user_settings_path, 'w', encoding='utf-8') as f:
json.dump(settings_data, f, indent=4)
print(f"Saved user settings to '{user_settings_path}'")
except Exception as e:
QMessageBox.critical(self, "Error Saving Settings", f"Could not save user settings to '{user_settings_path}': {e}")

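`_save_initial_user_settings` follows a load-merge-write pattern: read any existing `user_settings.json`, overlay the dialog values, and write the merged dict back, so keys the dialog does not know about are preserved. A minimal sketch of that merge step (`merge_and_save_settings` is a hypothetical helper, not the dialog's API):

```python
import json
from pathlib import Path

def merge_and_save_settings(path: Path, updates: dict) -> dict:
    """Overlay `updates` onto whatever JSON object already exists at `path`."""
    settings = {}
    if path.exists():
        try:
            settings = json.loads(path.read_text(encoding="utf-8"))
        except (OSError, json.JSONDecodeError):
            settings = {}  # unreadable/corrupt file: start fresh, as the dialog does
    settings.update(updates)
    settings.setdefault("general_settings", {})  # mirror the dialog's core-section guard
    path.write_text(json.dumps(settings, indent=4), encoding="utf-8")
    return settings
```

Because the whole file is rewritten in one call, a crash mid-write can still truncate it; a write-to-temp-then-rename step would harden this further.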
def _save_persistent_info(self):
if not self.user_chosen_path:
return

# 1. Save USER_CHOSEN_PATH to a persistent location (e.g., AppData)
persistent_storage_dir = get_os_specific_app_data_dir()
try:
persistent_storage_dir.mkdir(parents=True, exist_ok=True)
persistent_path_file = persistent_storage_dir / PERSISTENT_CONFIG_ROOT_STORAGE_FILENAME
with open(persistent_path_file, 'w', encoding='utf-8') as f:
f.write(str(self.user_chosen_path.resolve()))
print(f"Saved chosen config path to '{persistent_path_file}'")
except Exception as e:
QMessageBox.warning(self, "Error Saving Path", f"Could not persistently save the chosen configuration path: {e}")
# This is not critical enough to stop the setup, but user might need to re-select on next launch.

# 2. Create marker file in USER_CHOSEN_PATH
marker_file = self.user_chosen_path / PERSISTENT_PATH_MARKER_FILENAME
try:
with open(marker_file, 'w', encoding='utf-8') as f:
f.write("Asset Processor first-time setup complete.")
print(f"Created marker file at '{marker_file}'")
except Exception as e:
QMessageBox.warning(self, "Error Creating Marker", f"Could not create first-run marker file at '{marker_file}': {e}")

@Slot()
def _on_finish_setup(self):
if not self._validate_inputs():
return

# Confirmation before proceeding
reply = QMessageBox.question(self, "Confirm Setup",
f"The following path will be used for configuration and user data:\n"
f"{self.user_chosen_path}\n\n"
f"Default configuration files and presets will be copied if they don't exist.\n"
f"Initial user settings will be saved.\n\nProceed with setup?",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No,
QMessageBox.StandardButton.No)
if reply == QMessageBox.StandardButton.No:
return

try:
self._copy_default_files()
self._save_initial_user_settings()
self._save_persistent_info()
QMessageBox.information(self, "Setup Complete", "First-time setup completed successfully!")
self.accept()
except Exception as e:
QMessageBox.critical(self, "Setup Error", f"An unexpected error occurred during setup: {e}")
# Optionally, attempt cleanup or guide user

def get_chosen_config_path(self) -> Optional[Path]:
"""Returns the path chosen by the user after successful completion."""
if self.result() == QDialog.DialogCode.Accepted:
return self.user_chosen_path
return None

if __name__ == '__main__':
from PySide6.QtWidgets import QApplication
app = QApplication(sys.argv)
dialog = FirstTimeSetupDialog()
if dialog.exec():
chosen_path = dialog.get_chosen_config_path()
print(f"Dialog accepted. Chosen config path: {chosen_path}")
else:
print("Dialog cancelled.")
sys.exit()

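The deleted module persists two things on success: the chosen root path in a text file under the OS app-data directory, and a `.first_run_complete` marker inside the chosen root itself. The startup check this design implies — run setup again unless both survive — can be sketched as follows (`needs_first_time_setup` is a hypothetical helper, not shown in the diff):

```python
from pathlib import Path

MARKER_NAME = ".first_run_complete"

def needs_first_time_setup(stored_root_file: Path) -> bool:
    """Setup is needed unless the stored root exists and contains the marker."""
    if not stored_root_file.is_file():
        return True  # no persisted path: never set up
    root = Path(stored_root_file.read_text(encoding="utf-8").strip())
    return not (root / MARKER_NAME).is_file()
```

Requiring both files means a deleted or moved config root is detected even when the app-data pointer file is still present.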
@ -10,7 +10,7 @@ from PySide6.QtWidgets import (
from PySide6.QtCore import Slot as pyqtSlot, Signal as pyqtSignal # Use PySide6 equivalents

# Assuming configuration module exists and has relevant functions later
from configuration import ConfigurationError
from configuration import save_llm_config, ConfigurationError
# For now, define path directly for initial structure
LLM_CONFIG_PATH = "config/llm_settings.json"

@ -280,13 +280,7 @@ class LLMEditorWidget(QWidget):

# 1.d. Save Updated Content
try:
# Ensure the directory exists before saving
import os
os.makedirs(os.path.dirname(LLM_CONFIG_PATH), exist_ok=True)

with open(LLM_CONFIG_PATH, 'w', encoding='utf-8') as f:
json.dump(target_file_content, f, indent=4)

save_llm_config(target_file_content) # Save the potentially modified target_file_content
QMessageBox.information(self, "Save Successful", f"LLM settings saved to:\n{LLM_CONFIG_PATH}")

# Update original_llm_settings to reflect the newly saved state
@ -297,9 +291,9 @@ class LLMEditorWidget(QWidget):
self.settings_saved.emit()
logger.info("LLM settings saved successfully.")

except (IOError, OSError) as e:
logger.error(f"Failed to write LLM settings file {LLM_CONFIG_PATH}: {e}")
QMessageBox.critical(self, "Save Error", f"Could not write LLM settings file.\n\nError: {e}")
except ConfigurationError as e:
logger.error(f"Failed to save LLM settings: {e}")
QMessageBox.critical(self, "Save Error", f"Could not save LLM settings.\n\nError: {e}")
self.save_button.setEnabled(True) # Keep save enabled
self._unsaved_changes = True
except Exception as e:

@ -24,9 +24,6 @@ class LLMPredictionHandler(BasePredictionHandler):
Handles the interaction with an LLM for predicting asset structures
based on a directory's file list. Inherits from BasePredictionHandler.
"""
# Define a constant for files not classified by the LLM
FILE_UNCLASSIFIED_BY_LLM = "FILE_UNCLASSIFIED_BY_LLM"

# Signals (prediction_ready, prediction_error, status_update) are inherited

# Changed 'config: Configuration' to 'settings: dict'
@ -310,67 +307,54 @@ class LLMPredictionHandler(BasePredictionHandler):
valid_file_types = list(self.settings.get('file_type_definitions', {}).keys())
asset_rules_map: Dict[str, AssetRule] = {} # Maps group_name to AssetRule

# --- Map LLM File Analysis for Quick Lookup ---
llm_file_map: Dict[str, Dict[str, Any]] = {}
for file_data in response_data.get("individual_file_analysis", []):
if isinstance(file_data, dict):
file_path_rel = file_data.get("relative_file_path")
if file_path_rel and isinstance(file_path_rel, str):
llm_file_map[file_path_rel] = file_data
else:
log.warning(f"Skipping LLM file data entry with missing or invalid 'relative_file_path': {file_data}")
else:
log.warning(f"Skipping invalid LLM file data entry (not a dict): {file_data}")

# --- Process Actual Input Files and Reconcile with LLM Data ---
for file_path_rel in self.file_list:
# --- Process Individual Files and Build Rules ---
for file_data in response_data["individual_file_analysis"]:
# Check for cancellation within the loop
if self._is_cancelled:
log.info("LLM prediction cancelled during response parsing (files).")
return []

file_data = llm_file_map.pop(file_path_rel, None) # Get data if exists, remove from map
if not isinstance(file_data, dict):
log.warning(f"Skipping invalid file data entry (not a dict): {file_data}")
continue

if file_data:
# --- File found in LLM output - Use LLM Classification ---
file_path_rel = file_data.get("relative_file_path")
file_type = file_data.get("classified_file_type")
group_name = file_data.get("proposed_asset_group_name") # Can be string or null

# Validate file_type against definitions, unless it's FILE_IGNORE
if not file_type or not isinstance(file_type, str):
log.warning(f"Missing or invalid 'classified_file_type' for file '{file_path_rel}' from LLM. Defaulting to {self.FILE_UNCLASSIFIED_BY_LLM}.")
file_type = self.FILE_UNCLASSIFIED_BY_LLM
elif file_type != "FILE_IGNORE" and file_type not in valid_file_types:
log.warning(f"Invalid predicted_file_type '{file_type}' for file '{file_path_rel}' from LLM. Defaulting to EXTRA.")
file_type = "EXTRA"

# Handle FILE_IGNORE explicitly - do not create a rule for it
if file_type == "FILE_IGNORE":
log.debug(f"Ignoring file as per LLM prediction: {file_path_rel}")
# --- Validate File Data ---
if not file_path_rel or not isinstance(file_path_rel, str):
log.warning(f"Missing or invalid 'relative_file_path' in file data: {file_data}. Skipping file.")
continue

# Determine group name and asset type
if not group_name or not isinstance(group_name, str):
log.warning(f"File '{file_path_rel}' has missing, null, or invalid 'proposed_asset_group_name' ({group_name}) from LLM. Assigning to default asset.")
group_name = "Unclassified Files" # Default group name
asset_type = "UtilityMap" # Default asset type for unclassified files (or another sensible default)
else:
asset_type = response_data["asset_group_classifications"].get(group_name)
if not asset_type:
log.warning(f"No classification found in 'asset_group_classifications' for group '{group_name}' (proposed for file '{file_path_rel}'). Assigning to default asset.")
group_name = "Unclassified Files" # Default group name
asset_type = "UtilityMap" # Default asset type
elif asset_type not in valid_asset_types:
log.warning(f"Invalid asset_type '{asset_type}' found in 'asset_group_classifications' for group '{group_name}'. Assigning to default asset.")
group_name = "Unclassified Files" # Default group name
asset_type = "UtilityMap" # Default asset type
if not file_type or not isinstance(file_type, str):
log.warning(f"Missing or invalid 'classified_file_type' for file '{file_path_rel}'. Skipping file.")
continue

else:
# --- File NOT found in LLM output - Assign Default Classification ---
log.warning(f"File '{file_path_rel}' from input list was NOT classified by LLM. Assigning type {self.FILE_UNCLASSIFIED_BY_LLM} and default asset.")
file_type = self.FILE_UNCLASSIFIED_BY_LLM
group_name = "Unclassified Files" # Default group name
asset_type = "UtilityMap" # Default asset type
# Handle FILE_IGNORE explicitly
if file_type == "FILE_IGNORE":
log.debug(f"Ignoring file as per LLM prediction: {file_path_rel}")
continue # Skip creating a rule for this file

# Validate file_type against definitions
if file_type not in valid_file_types:
log.warning(f"Invalid predicted_file_type '{file_type}' for file '{file_path_rel}'. Defaulting to EXTRA.")
file_type = "EXTRA"

# --- Handle Grouping and Asset Type ---
if not group_name or not isinstance(group_name, str):
log.warning(f"File '{file_path_rel}' has missing, null, or invalid 'proposed_asset_group_name' ({group_name}). Cannot assign to an asset. Skipping file.")
continue

asset_type = response_data["asset_group_classifications"].get(group_name)

if not asset_type:
log.warning(f"No classification found in 'asset_group_classifications' for group '{group_name}' (proposed for file '{file_path_rel}'). Skipping file.")
continue

if asset_type not in valid_asset_types:
log.warning(f"Invalid asset_type '{asset_type}' found in 'asset_group_classifications' for group '{group_name}'. Skipping file '{file_path_rel}'.")
continue

# --- Construct Absolute Path ---
try:

@ -389,34 +373,25 @@ class LLMPredictionHandler(BasePredictionHandler):
# Create new AssetRule if this is the first file for this group
log.debug(f"Creating new AssetRule for group '{group_name}' with type '{asset_type}'.")
asset_rule = AssetRule(asset_name=group_name, asset_type=asset_type)
asset_rule.parent_source = source_rule # Set parent back-reference
source_rule.assets.append(asset_rule)
asset_rules_map[group_name] = asset_rule
# If asset_rule already exists, ensure its type is consistent or handle conflicts if necessary.
# For now, we'll assume the first file dictates the asset type for the default group.
# For LLM-classified groups, the type comes from asset_group_classifications.

# --- Create and Add File Rule ---
file_rule = FileRule(
file_path=file_path_abs,
item_type=file_type,
item_type_override=file_type, # Initial override based on classification (LLM or default)
item_type_override=file_type, # Initial override based on LLM
target_asset_name_override=group_name,
output_format_override=None,
resolution_override=None,
channel_merge_instructions={}
)
file_rule.parent_asset = asset_rule # Set parent back-reference
asset_rule.files.append(file_rule)
log.debug(f"Added file '{file_path_rel}' (type: {file_type}) to asset '{group_name}'.")

# --- Handle LLM Hallucinations (Remaining entries in llm_file_map) ---
for file_path_rel, file_data in llm_file_map.items():
log.warning(f"LLM predicted file '{file_path_rel}' which was NOT in the actual input file list. Ignoring this hallucinated entry.")
# No FileRule is created for this hallucinated file.

# Log if no assets were created
if not source_rule.assets:
log.warning(f"LLM prediction for '{self.input_source_identifier}' resulted in zero valid assets after processing actual file list.")
log.warning(f"LLM prediction for '{self.input_source_identifier}' resulted in zero valid assets after parsing.")

return [source_rule] # Return list containing the single SourceRule

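The rewritten parser inverts control: instead of iterating the LLM's `individual_file_analysis`, it iterates the actual input file list, `pop`s each path out of a lookup map built from the LLM response, and treats whatever remains in the map as hallucinated entries. Reduced to its essentials (names are illustrative, not the handler's real API):

```python
def reconcile(actual_files: list[str], llm_analysis: list[dict]):
    """Split files into (classified, unclassified) and report hallucinations."""
    llm_map = {d["relative_file_path"]: d for d in llm_analysis
               if isinstance(d, dict) and isinstance(d.get("relative_file_path"), str)}
    classified, unclassified = {}, []
    for path in actual_files:
        data = llm_map.pop(path, None)  # remove entry so leftovers = hallucinations
        if data is None:
            unclassified.append(path)   # real file the LLM never mentioned
        else:
            classified[path] = data.get("classified_file_type", "FILE_UNCLASSIFIED_BY_LLM")
    hallucinated = list(llm_map)        # LLM entries with no real file behind them
    return classified, unclassified, hallucinated
```

Driving the loop from `self.file_list` rather than the LLM output is what guarantees every real file gets a rule (or an explicit default) and no rule is created for an invented path — the two failure modes the old loop allowed.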
@ -23,8 +23,15 @@ from .unified_view_model import UnifiedViewModel

from rule_structure import SourceRule, AssetRule, FileRule
import configuration
try:
from configuration import ConfigurationError, load_base_config
except ImportError:
ConfigurationError = Exception
load_base_config = None
class configuration:
PRESETS_DIR = "Presets"

log = logging.getLogger(__name__)
from configuration import Configuration, ConfigurationError # Import Configuration class and Error

class MainPanelWidget(QWidget):
"""

@ -50,7 +57,7 @@ class MainPanelWidget(QWidget):
|
||||
|
||||
blender_settings_changed = Signal(bool, str, str)
|
||||
|
||||
def __init__(self, config: Configuration, unified_model: UnifiedViewModel, parent=None, file_type_keys: list[str] | None = None):
|
||||
def __init__(self, unified_model: UnifiedViewModel, parent=None, file_type_keys: list[str] | None = None):
|
||||
"""
|
||||
Initializes the MainPanelWidget.
|
||||
|
||||
@ -60,7 +67,6 @@ class MainPanelWidget(QWidget):
|
||||
file_type_keys: A list of available file type names (keys from FILE_TYPE_DEFINITIONS).
|
||||
"""
|
||||
super().__init__(parent)
|
||||
self._config = config # Store the Configuration object
|
||||
self.unified_model = unified_model
|
||||
self.file_type_keys = file_type_keys if file_type_keys else []
|
||||
self.llm_processing_active = False
|
||||
@@ -85,19 +91,21 @@
        output_layout.addWidget(self.browse_output_button)
        main_layout.addLayout(output_layout)
 
+       if load_base_config:
            try:
-               # Access configuration directly from the stored object
-               # Use the output_directory_pattern from the Configuration object
-               output_pattern = self._config.output_directory_pattern
-               # Assuming the pattern is relative to the project root for the default
-               default_output_dir = (self.project_root / output_pattern).resolve()
+               base_config = load_base_config()
+               output_base_dir_config = base_config.get('OUTPUT_BASE_DIR', '../Asset_Processor_Output')
+               default_output_dir = (self.project_root / output_base_dir_config).resolve()
                self.output_path_edit.setText(str(default_output_dir))
-               log.info(f"MainPanelWidget: Default output directory set to: {default_output_dir} based on pattern '{output_pattern}'")
+               log.info(f"MainPanelWidget: Default output directory set to: {default_output_dir}")
            except ConfigurationError as e:
-               log.error(f"MainPanelWidget: Configuration Error setting default output directory: {e}")
+               log.error(f"MainPanelWidget: Error reading base configuration for default output directory: {e}")
                self.output_path_edit.setText("")
            except Exception as e:
-               log.exception(f"MainPanelWidget: Unexpected Error setting default output directory: {e}")
+               log.exception(f"MainPanelWidget: Error setting default output directory: {e}")
                self.output_path_edit.setText("")
+       else:
+           log.warning("MainPanelWidget: load_base_config not available to set default output path.")
+           self.output_path_edit.setText("")
 
 
@@ -172,14 +180,19 @@
        materials_layout.addWidget(self.browse_materials_blend_button)
        blender_layout.addLayout(materials_layout)
 
+       if load_base_config:
            try:
-               # Use hardcoded defaults as Configuration object does not expose these via public interface
-               default_ng_path = ''
-               default_mat_path = ''
+               base_config = load_base_config()
+               default_ng_path = base_config.get('DEFAULT_NODEGROUP_BLEND_PATH', '')
+               default_mat_path = base_config.get('DEFAULT_MATERIALS_BLEND_PATH', '')
                self.nodegroup_blend_path_input.setText(default_ng_path if default_ng_path else "")
                self.materials_blend_path_input.setText(default_mat_path if default_mat_path else "")
+           except ConfigurationError as e:
+               log.error(f"MainPanelWidget: Error reading base configuration for default Blender paths: {e}")
            except Exception as e:
-               log.error(f"MainPanelWidget: Error setting default Blender paths: {e}")
+               log.error(f"MainPanelWidget: Error reading default Blender paths from config: {e}")
+       else:
+           log.warning("MainPanelWidget: load_base_config not available to set default Blender paths.")
 
 
        self.nodegroup_blend_path_input.setEnabled(False)
 
@@ -46,13 +46,14 @@ if str(project_root) not in sys.path:
     sys.path.insert(0, str(project_root))
 
 try:
-    from configuration import Configuration, ConfigurationError
+    from configuration import Configuration, ConfigurationError, load_base_config
 
 
 except ImportError as e:
     print(f"ERROR: Failed to import backend modules: {e}")
     print(f"Ensure GUI is run from project root or backend modules are in PYTHONPATH.")
     Configuration = None
+    load_base_config = None
     ConfigurationError = Exception
     AssetProcessor = None
     RuleBasedPredictionHandler = None
@@ -96,9 +97,8 @@ class MainWindow(QMainWindow):
     start_prediction_signal = Signal(str, list, str)
     start_backend_processing = Signal(list, dict)
 
-    def __init__(self, config: Configuration):
+    def __init__(self):
         super().__init__()
-        self.config = config  # Store the Configuration object
 
         self.setWindowTitle("Asset Processor Tool")
         self.resize(1200, 700)
@@ -132,7 +132,7 @@
         self.setCentralWidget(self.splitter)
 
         # --- Create Models ---
-        self.unified_model = UnifiedViewModel(config=self.config)
+        self.unified_model = UnifiedViewModel()
         # --- Instantiate Handlers that depend on the model ---
         self.restructure_handler = AssetRestructureHandler(self.unified_model, self)
 
@@ -143,16 +143,17 @@
         # --- Load File Type Definitions for Rule Editor ---
         file_type_keys = []
         try:
-            # Access configuration directly from the stored object using public methods
-            file_type_defs = self.config.get_file_type_definitions_with_examples()
-            file_type_keys = list(file_type_defs.keys())
+            base_cfg_data = load_base_config()
+            if base_cfg_data and "FILE_TYPE_DEFINITIONS" in base_cfg_data:
+                file_type_keys = list(base_cfg_data["FILE_TYPE_DEFINITIONS"].keys())
                 log.info(f"Loaded {len(file_type_keys)} FILE_TYPE_DEFINITIONS keys for RuleEditor.")
+            else:
+                log.warning("FILE_TYPE_DEFINITIONS not found in base_config. RuleEditor item_type dropdown might be empty.")
         except Exception as e:
             log.exception(f"Error loading FILE_TYPE_DEFINITIONS for RuleEditor: {e}")
             file_type_keys = []  # Ensure it's a list even on error
 
-        # Instantiate MainPanelWidget, passing the config, model, self (MainWindow) for context, and file_type_keys
-        self.main_panel_widget = MainPanelWidget(config=self.config, unified_model=self.unified_model, parent=self, file_type_keys=file_type_keys)
+        # Instantiate MainPanelWidget, passing the model, self (MainWindow) for context, and file_type_keys
+        self.main_panel_widget = MainPanelWidget(self.unified_model, self, file_type_keys=file_type_keys)
         self.log_console = LogConsoleWidget(self)
 
         # --- Create Left Pane with Static Selector and Stacked Editor ---
@@ -214,8 +215,8 @@
         }
         self.qt_key_to_ftd_map = {}
         try:
-            # Access configuration directly from the stored object using public methods
-            file_type_defs = self.config.get_file_type_definitions_with_examples()
+            base_settings = load_base_config()
+            file_type_defs = base_settings.get('FILE_TYPE_DEFINITIONS', {})
             for ftd_key, ftd_value in file_type_defs.items():
                 if isinstance(ftd_value, dict) and 'keybind' in ftd_value:
                     char_key = ftd_value['keybind']
@@ -786,8 +787,7 @@
 
         if RuleBasedPredictionHandler and self.prediction_thread is None:
             self.prediction_thread = QThread(self)
-            # Pass the Configuration object to the prediction handler
-            self.prediction_handler = RuleBasedPredictionHandler(config_obj=self.config, input_source_identifier="", original_input_paths=[], preset_name="")
+            self.prediction_handler = RuleBasedPredictionHandler(input_source_identifier="", original_input_paths=[], preset_name="")
             self.prediction_handler.moveToThread(self.prediction_thread)
 
             self.start_prediction_signal.connect(self.prediction_handler.run_prediction, Qt.ConnectionType.QueuedConnection)
@@ -1340,7 +1340,6 @@ def run_gui():
     """Initializes and runs the Qt application."""
     print("--- Reached run_gui() ---")
     from PySide6.QtGui import QKeySequence
-    from configuration import Configuration  # Import Configuration here for instantiation
 
     app = QApplication(sys.argv)
 
@@ -1352,16 +1351,7 @@
 
     app.setPalette(palette)
 
-    # Create a Configuration instance and pass it to MainWindow
-    try:
-        config = Configuration()
-        log.info("Configuration loaded successfully for GUI.")
-    except Exception as e:
-        log.critical(f"Failed to load configuration for GUI: {e}")
-        QMessageBox.critical(None, "Configuration Error", f"Failed to load application configuration:\n{e}\n\nApplication will exit.")
-        sys.exit(1)  # Exit if configuration fails
-
-    window = MainWindow(config)
+    window = MainWindow()
     window.show()
     sys.exit(app.exec())
 
 
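The Dev block removed above is a fail-fast bootstrap: load the configuration once, before any window exists, and abort with a visible error if it cannot be loaded. A framework-free sketch of that shape (the `load_app_config`, `build_window`, and `on_fatal` callables are stand-ins for `Configuration()`, `MainWindow(config)`, and `QMessageBox.critical`):

```python
import sys

def boot(load_app_config, build_window, on_fatal):
    """Load config before any UI exists; abort with a clear error if it fails."""
    try:
        config = load_app_config()
    except Exception as e:
        on_fatal(f"Failed to load application configuration: {e}")
        sys.exit(1)
    return build_window(config)
```

Loading once at the entry point is what lets every downstream object receive the config by constructor argument instead of re-reading it, which is the central difference between the two branches in this diff.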
@@ -6,7 +6,7 @@ import re
 import tempfile
 import zipfile
 from collections import defaultdict, Counter
-from typing import List, Dict, Any, Set, Tuple  # Added Set, Tuple
+from typing import List, Dict, Any
 
 # --- PySide6 Imports ---
 from PySide6.QtCore import QObject, Slot  # Keep QObject for parent type hint, Slot for classify_files if kept as method
@@ -303,19 +303,17 @@ class RuleBasedPredictionHandler(BasePredictionHandler):
     Inherits from BasePredictionHandler for common threading and signaling.
     """
 
-    def __init__(self, config_obj: Configuration, input_source_identifier: str, original_input_paths: list[str], preset_name: str, parent: QObject = None):
+    def __init__(self, input_source_identifier: str, original_input_paths: list[str], preset_name: str, parent: QObject = None):
         """
-        Initializes the rule-based handler with a Configuration object.
+        Initializes the rule-based handler.
 
         Args:
-            config_obj: The main configuration object.
             input_source_identifier: The unique identifier for the input source (e.g., file path).
             original_input_paths: List of absolute file paths extracted from the source.
             preset_name: The name of the preset configuration to use.
             parent: The parent QObject.
         """
         super().__init__(input_source_identifier, parent)
-        self.config = config_obj  # Store the Configuration object
         self.original_input_paths = original_input_paths
         self.preset_name = preset_name
         self._current_input_path = None
@@ -364,24 +362,16 @@ class RuleBasedPredictionHandler(BasePredictionHandler):
             log.warning(f"Input source path does not exist: '{input_source_identifier}'. Skipping prediction.")
             raise FileNotFoundError(f"Input source path not found: {input_source_identifier}")
 
-        # --- Use Provided Configuration ---
-        # The Configuration object is now passed during initialization.
-        # Ensure the correct preset is loaded in the passed config object if necessary,
-        # or rely on the caller (MainWindow) to ensure the config object is in the correct state.
-        # MainWindow's load_preset method re-initializes the config, so it should be correct.
-        # We just need to use the stored self.config.
-        log.info(f"Using provided configuration object for preset '{preset_name}'.")
-        # No need to create a new Configuration instance here.
-        # config = Configuration(preset_name) # REMOVED
-        # log.info(f"Successfully loaded configuration for preset '{preset_name}'.") # REMOVED
+        # --- Load Configuration ---
+        config = Configuration(preset_name)
+        log.info(f"Successfully loaded configuration for preset '{preset_name}'.")
 
         if self._is_cancelled: raise RuntimeError("Prediction cancelled before classification.")
 
         # --- Perform Classification ---
         self.status_update.emit(f"Classifying files for '{source_path.name}'...")
         try:
-            # Use the stored config object
-            classified_assets = classify_files(original_input_paths, self.config)
+            classified_assets = classify_files(original_input_paths, config)
         except Exception as e:
             log.exception(f"Error during file classification for source '{input_source_identifier}': {e}")
             raise RuntimeError(f"Error classifying files: {e}") from e
@@ -398,29 +388,26 @@ class RuleBasedPredictionHandler(BasePredictionHandler):
         # --- Build the Hierarchy ---
         self.status_update.emit(f"Building rule hierarchy for '{source_path.name}'...")
         try:
-            # Use the stored config object
-            supplier_identifier = self.config.supplier_name
+            supplier_identifier = config.supplier_name
             source_rule = SourceRule(
                 input_path=input_source_identifier,
                 supplier_identifier=supplier_identifier,
-                # Use the internal display name from the stored config object
-                preset_name=self.config.internal_display_preset_name
+                # Use the internal display name from the config object
+                preset_name=config.internal_display_preset_name
             )
             asset_rules = []
-            # Access file type definitions via the public getter method from the stored config object
-            file_type_definitions = self.config.get_file_type_definitions_with_examples()
+            file_type_definitions = config._core_settings.get('FILE_TYPE_DEFINITIONS', {})
 
             for asset_name, files_info in classified_assets.items():
                 if self._is_cancelled: raise RuntimeError("Prediction cancelled during hierarchy building (assets).")
                 if not files_info: continue
 
-                # Use the stored config object
-                asset_category_rules = self.config.asset_category_rules
-                asset_type_definitions = self.config.get_asset_type_definitions()
+                asset_category_rules = config.asset_category_rules
+                asset_type_definitions = config.get_asset_type_definitions()
                 asset_type_keys = list(asset_type_definitions.keys())
 
-                # Initialize predicted_asset_type using the validated default from stored config
-                predicted_asset_type = self.config.default_asset_category
+                # Initialize predicted_asset_type using the validated default
+                predicted_asset_type = config.default_asset_category
                 log.debug(f"Asset '{asset_name}': Initial predicted_asset_type set to default: '{predicted_asset_type}'.")
 
                 # 1. Check asset_category_rules from preset
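Both branches of the hunk above build the same `SourceRule -> AssetRule -> FileRule` hierarchy from the classification output; only where the definitions come from differs. A stripped-down sketch with stand-in dataclasses (field names follow the diff; the `classified` input shape of asset name to file-info dicts is assumed from how the loop consumes it):

```python
from dataclasses import dataclass, field

@dataclass
class FileRule:
    file_path: str
    item_type: str

@dataclass
class AssetRule:
    asset_name: str
    asset_type: str
    files: list = field(default_factory=list)

@dataclass
class SourceRule:
    input_path: str
    assets: list = field(default_factory=list)

def build_hierarchy(input_path: str, classified: dict, default_type: str = "Surface") -> SourceRule:
    """classified maps asset name -> list of {'file_path', 'item_type'} dicts."""
    source = SourceRule(input_path=input_path)
    for asset_name, files_info in classified.items():
        if not files_info:
            continue  # skip empty assets so no empty rows reach the view
        asset = AssetRule(asset_name=asset_name, asset_type=default_type)
        for info in files_info:
            asset.files.append(FileRule(info["file_path"], info["item_type"]))
        source.assets.append(asset)
    return source
```

The `if not files_info: continue` guard is the prediction-side half of the "persistent empty asset rows" problem named in the review plan; the model-side half is the post-drop cleanup later in this diff.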
@@ -428,8 +415,7 @@ class RuleBasedPredictionHandler(BasePredictionHandler):
 
                 # Check for Model type based on file patterns
                 if "Model" in asset_type_keys:
-                    # Use the stored config object
-                    model_patterns_regex = self.config.compiled_model_regex
+                    model_patterns_regex = config.compiled_model_regex
                     for f_info in files_info:
                         if f_info['item_type'] in ["EXTRA", "FILE_IGNORE"]:
                             continue
@@ -461,13 +447,12 @@ class RuleBasedPredictionHandler(BasePredictionHandler):
                         pass
 
                 # 2. If not determined by specific rules, check for Surface (if not Model/Decal by rule)
-                if not determined_by_rule and predicted_asset_type == self.config.default_asset_category and "Surface" in asset_type_keys:
+                if not determined_by_rule and predicted_asset_type == config.default_asset_category and "Surface" in asset_type_keys:
                     item_types_in_asset = {f_info['item_type'] for f_info in files_info}
                     # Ensure we are checking against standard map types from FILE_TYPE_DEFINITIONS
                     # This check is primarily for PBR texture sets.
-                    # Use the stored config object
                     material_indicators = {
-                        ft_key for ft_key, ft_def in self.config.get_file_type_definitions_with_examples().items()
+                        ft_key for ft_key, ft_def in config.get_file_type_definitions_with_examples().items()
                         if ft_def.get('standard_type') and ft_def.get('standard_type') not in ["", "EXTRA", "FILE_IGNORE", "MODEL"]
                     }
                     # Add common direct standard types as well for robustness
@@ -481,7 +466,7 @@
                             has_material_map = True
                             break
                         # Check standard type if item_type is a key in FILE_TYPE_DEFINITIONS
-                        item_def = self.config.get_file_type_definitions_with_examples().get(item_type)
+                        item_def = config.get_file_type_definitions_with_examples().get(item_type)
                         if item_def and item_def.get('standard_type') in material_indicators:
                             has_material_map = True
                             break
@@ -493,8 +478,8 @@
                 # 3. Final validation: Ensure predicted_asset_type is a valid key.
                 if predicted_asset_type not in asset_type_keys:
                     log.warning(f"Derived AssetType '{predicted_asset_type}' for asset '{asset_name}' is not in ASSET_TYPE_DEFINITIONS. "
-                                f"Falling back to default: '{self.config.default_asset_category}'.")
-                    predicted_asset_type = self.config.default_asset_category
+                                f"Falling back to default: '{config.default_asset_category}'.")
+                    predicted_asset_type = config.default_asset_category
 
                 asset_rule = AssetRule(asset_name=asset_name, asset_type=predicted_asset_type)
                 file_rules = []
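Step 3 above and the item-type check in the next hunk share one idea: never let a predicted key leave the handler unless it exists in the loaded definitions. The two clamps can be sketched as tiny standalone helpers (function names are illustrative; the `EXTRA` / `FILE_IGNORE` sentinels come from the diff):

```python
def validate_asset_type(predicted: str, asset_type_keys: list, default: str) -> str:
    """Clamp a derived asset type to the known ASSET_TYPE_DEFINITIONS keys."""
    return predicted if predicted in asset_type_keys else default

def validate_item_type(item_type: str, file_type_definitions: dict) -> str:
    """EXTRA and FILE_IGNORE are always legal; anything else must be defined."""
    if item_type in ("EXTRA", "FILE_IGNORE"):
        return item_type
    return item_type if item_type in file_type_definitions else "FILE_IGNORE"
```

Centralizing the fallback keeps invalid keys out of the `UnifiedViewModel`, where an unknown type would otherwise render as an uncolored or empty-looking row.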
@@ -509,8 +494,7 @@
                     # No need for the old MAP_ prefixing logic here.
 
                     # Validate the final_item_type against definitions, unless it's EXTRA or FILE_IGNORE
-                    # Use the stored config object
-                    if final_item_type not in ["EXTRA", "FILE_IGNORE"] and self.config.get_file_type_definitions_with_examples() and final_item_type not in self.config.get_file_type_definitions_with_examples():
+                    if final_item_type not in ["EXTRA", "FILE_IGNORE"] and file_type_definitions and final_item_type not in file_type_definitions:
                         log.warning(f"Predicted ItemType '{final_item_type}' for file '{file_info['file_path']}' is not in FILE_TYPE_DEFINITIONS. Setting to FILE_IGNORE.")
                         final_item_type = "FILE_IGNORE"
 
 
@@ -1,12 +1,12 @@
 # gui/unified_view_model.py
 import logging
 log = logging.getLogger(__name__)
-from PySide6.QtCore import QAbstractItemModel, QModelIndex, Qt, Signal, Slot, QMimeData, QByteArray, QDataStream, QIODevice, QPersistentModelIndex
+from PySide6.QtCore import QAbstractItemModel, QModelIndex, Qt, Signal, Slot, QMimeData, QByteArray, QDataStream, QIODevice
 from PySide6.QtGui import QColor
 from pathlib import Path
 from rule_structure import SourceRule, AssetRule, FileRule
+from configuration import load_base_config
 from typing import List
-from configuration import Configuration  # Import Configuration class
 
 class CustomRoles:
     MapTypeRole = Qt.UserRole + 1
@@ -46,9 +46,8 @@ class UnifiedViewModel(QAbstractItemModel):
     # --- Drag and Drop MIME Type ---
     MIME_TYPE = "application/x-filerule-index-list"
 
-    def __init__(self, config: Configuration, parent=None):
+    def __init__(self, parent=None):
         super().__init__(parent)
-        self._config = config  # Store the Configuration object
         self._source_rules = []
         # self._display_mode removed
         self._asset_type_colors = {}
@@ -60,9 +59,9 @@
     def _load_definitions(self):
         """Loads configuration and caches colors and type keys."""
         try:
-            # Access configuration directly from the stored object using public methods
-            asset_type_defs = self._config.get_asset_type_definitions()
-            file_type_defs = self._config.get_file_type_definitions_with_examples()
+            base_config = load_base_config()
+            asset_type_defs = base_config.get('ASSET_TYPE_DEFINITIONS', {})
+            file_type_defs = base_config.get('FILE_TYPE_DEFINITIONS', {})
 
             # Cache Asset Type Definitions (Keys and Colors)
             self._asset_type_keys = sorted(list(asset_type_defs.keys()))
@@ -906,23 +905,37 @@
         encoded_data = QByteArray()
         stream = QDataStream(encoded_data, QIODevice.OpenModeFlag.WriteOnly)
 
-        # Store QPersistentModelIndex for robustness
-        # Collect file paths of dragged FileRule items
-        file_paths = []
+        dragged_file_info = []
         for index in indexes:
-            if index.isValid() and index.column() == 0:
+            if not index.isValid() or index.column() != 0:
+                continue
             item = index.internalPointer()
             if isinstance(item, FileRule):
-                file_paths.append(item.file_path)
-                log.debug(f"mimeData: Added file path for file: {Path(item.file_path).name}")
+                parent_index = self.parent(index)
+                if parent_index.isValid():
+                    # Store: source_row, source_parent_row, source_grandparent_row
+                    # This allows reconstructing the index later
+                    grandparent_index = self.parent(parent_index)
+                    # Ensure grandparent_index is valid before accessing its row
+                    if grandparent_index.isValid():
+                        dragged_file_info.append((index.row(), parent_index.row(), grandparent_index.row()))
+                    else:
+                        # Handle case where grandparent is the root (shouldn't happen for FileRule, but safety)
+                        # Or if parent() failed unexpectedly
+                        log.warning(f"mimeData: Could not get valid grandparent index for FileRule at row {index.row()}, parent row {parent_index.row()}")
+                else:
+                    log.warning(f"mimeData: Could not get parent index for FileRule at row {index.row()}")
 
-        # Write the number of items first, then each file path string
-        stream.writeInt32(len(file_paths))  # Use writeInt32 for potentially more items
-        for file_path in file_paths:
-            stream.writeQString(file_path)  # Use writeQString for strings
+        # Write the number of items first, then each tuple
+        stream.writeInt8(len(dragged_file_info))
+        for info in dragged_file_info:
+            stream.writeInt8(info[0])
+            stream.writeInt8(info[1])
+            stream.writeInt8(info[2])
 
         mime_data.setData(self.MIME_TYPE, encoded_data)
-        log.debug(f"mimeData: Encoded {len(file_paths)} FileRule file paths.")
+        log.debug(f"mimeData: Encoded {len(dragged_file_info)} FileRule indices.")
         return mime_data
 
     def canDropMimeData(self, data: QMimeData, action: Qt.DropAction, row: int, column: int, parent: QModelIndex) -> bool:
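One concrete difference between the two payloads above is the field width: the GUI branch serializes each row number with `writeInt8`, so a `qint8` payload cannot represent a row or item count above 127, while the Dev branch's `writeInt32` comment ("for potentially more items") sidesteps exactly that limit. The width trade-off can be demonstrated with `struct` as a stand-in for `QDataStream` (which also uses big-endian fixed-width encoding by default); unlike Qt, `struct` raises instead of truncating, which makes the failure visible:

```python
import struct

def encode_rows_int8(rows):
    # signed 8-bit, big-endian: valid range is -128..127, like qint8
    return b"".join(struct.pack(">b", r) for r in rows)

def encode_rows_int32(rows):
    # signed 32-bit, big-endian: room for any realistic row count
    return b"".join(struct.pack(">i", r) for r in rows)

ok = encode_rows_int8([0, 5, 127])        # fits in a signed byte
wide = encode_rows_int32([0, 5, 300])     # row 300 needs the 32-bit encoding
try:
    encode_rows_int8([300])               # would not survive a qint8 round-trip
    overflowed = False
except struct.error:
    overflowed = True
```

This suggests the drag-and-drop encoding is a silent correctness ceiling for large file tables, independent of the stale-index problem the "BEGIN FIX" block below addresses.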
@@ -957,68 +970,75 @@
         encoded_data = data.data(self.MIME_TYPE)
         stream = QDataStream(encoded_data, QIODevice.OpenModeFlag.ReadOnly)
 
-        # Read file paths from the stream
-        dragged_file_paths = []
-        num_items = stream.readInt32()
-        log.debug(f"dropMimeData: Decoding {num_items} file paths.")
+        num_items = stream.readInt8()
+        source_indices_info = []
         for _ in range(num_items):
-            dragged_file_paths.append(stream.readQString())  # Use readQString for strings
+            source_row = stream.readInt8()
+            source_parent_row = stream.readInt8()
+            source_grandparent_row = stream.readInt8()
+            source_indices_info.append((source_row, source_parent_row, source_grandparent_row))
 
-        log.debug(f"dropMimeData: Decoded {len(dragged_file_paths)} file paths. Target Asset: '{target_asset_item.asset_name}'")
+        log.debug(f"dropMimeData: Decoded {len(source_indices_info)} source indices. Target Asset: '{target_asset_item.asset_name}'")
 
-        if not dragged_file_paths:
-            log.warning("dropMimeData: No file path information decoded.")
-            return False
-
-        # Find the current FileRule objects and their indices based on file paths
-        dragged_items_with_indices = []
-        for file_path in dragged_file_paths:
-            found_item = None
-            found_index = QModelIndex()
-            # Iterate through the model to find the FileRule object by file_path
-            for sr_row, source_rule in enumerate(self._source_rules):
-                for ar_row, asset_rule in enumerate(source_rule.assets):
-                    for fr_row, file_rule in enumerate(asset_rule.files):
-                        if file_rule.file_path == file_path:
-                            found_item = file_rule
-                            # Get the current index for this item
-                            parent_asset_index = self.index(ar_row, 0, self.createIndex(sr_row, 0, source_rule))
-                            if parent_asset_index.isValid():
-                                found_index = self.index(fr_row, 0, parent_asset_index)
-                                if found_index.isValid():
-                                    dragged_items_with_indices.append((found_item, found_index))
-                                    log.debug(f"dropMimeData: Found item and index for file: {Path(file_path).name}")
-                                else:
-                                    log.warning(f"dropMimeData: Could not get valid index for found file item: {Path(file_path).name}")
-                            else:
-                                log.warning(f"dropMimeData: Could not get valid parent asset index for found file item: {Path(file_path).name}")
-                            break  # Found the file rule, move to the next dragged file path
-                    if found_item: break  # Found the file rule, move to the next dragged file path
-                if found_item: break  # Found the file rule, move to the next dragged file path
-
-            if not found_item:
-                log.warning(f"dropMimeData: Could not find FileRule item for path: {file_path}. Skipping.")
-
-        if not dragged_items_with_indices:
-            log.warning("dropMimeData: No valid FileRule items found in the model for the dragged paths.")
+        if not source_indices_info:
+            log.warning("dropMimeData: No valid source index information decoded.")
             return False
 
         # Keep track of original parents that might become empty
-        original_parents_to_check = set()
+        original_parents = set()
         moved_files_new_indices = {}
 
-        # Process moves using the retrieved items and their current indices
-        for file_item, source_file_index in dragged_items_with_indices:
-            # Track original parent for cleanup using the parent back-reference
-            old_parent_asset = getattr(file_item, 'parent_asset', None)
-            if old_parent_asset and isinstance(old_parent_asset, AssetRule):
-                source_rule = getattr(old_parent_asset, 'parent_source', None)
-                if source_rule:
-                    # Store a hashable representation (tuple of identifiers)
-                    original_parents_to_check.add((source_rule.input_path, old_parent_asset.asset_name))
-                else:
-                    log.warning(f"dropMimeData: Original parent asset '{old_parent_asset.asset_name}' has no parent source reference for cleanup tracking.")
+        # --- BEGIN FIX: Reconstruct all source indices BEFORE the move loop ---
+        source_indices_to_process = []
+        log.debug("Reconstructing initial source indices...")
+        for src_row, src_parent_row, src_grandparent_row in source_indices_info:
+            grandparent_index = self.index(src_grandparent_row, 0, QModelIndex())
+            if not grandparent_index.isValid():
+                log.error(f"dropMimeData: Failed initial reconstruction of grandparent index (row {src_grandparent_row}). Skipping item.")
+                continue
+            old_parent_index = self.index(src_parent_row, 0, grandparent_index)
+            if not old_parent_index.isValid():
+                log.error(f"dropMimeData: Failed initial reconstruction of old parent index (row {src_parent_row}). Skipping item.")
+                continue
+            source_file_index = self.index(src_row, 0, old_parent_index)
+            if not source_file_index.isValid():
+                # Log the specific parent it failed under for better debugging
+                parent_name = getattr(old_parent_index.internalPointer(), 'asset_name', 'Unknown Parent')
+                log.error(f"dropMimeData: Failed initial reconstruction of source file index (original row {src_row}) under parent '{parent_name}'. Skipping item.")
+                continue
+
+            # Check if the reconstructed index actually points to a FileRule
+            item_check = source_file_index.internalPointer()
+            if isinstance(item_check, FileRule):
+                source_indices_to_process.append(source_file_index)
+                log.debug(f"  Successfully reconstructed index for file: {Path(item_check.file_path).name}")
+            else:
+                log.warning(f"dropMimeData: Initial reconstructed index (row {src_row}) does not point to a FileRule. Skipping.")
+
+        log.debug(f"Successfully reconstructed {len(source_indices_to_process)} valid source indices.")
+        # --- END FIX ---
+
+        # Process moves using the pre-calculated valid indices
+        for source_file_index in source_indices_to_process:
+            # Get the file item (already validated during reconstruction)
+            file_item = source_file_index.internalPointer()
+
+            # Track original parent for cleanup (using the valid index)
+            old_parent_index = self.parent(source_file_index)
+            if old_parent_index.isValid():
+                old_parent_asset = old_parent_index.internalPointer()
+                if isinstance(old_parent_asset, AssetRule):
+                    # Need grandparent row for the tuple key
+                    grandparent_index = self.parent(old_parent_index)
+                    if grandparent_index.isValid():
+                        original_parents.add((grandparent_index.row(), old_parent_asset.asset_name))
+                    else:
+                        log.warning(f"Could not get grandparent index for original parent '{old_parent_asset.asset_name}' during cleanup tracking.")
+                else:
+                    log.warning(f"Parent of file '{Path(file_item.file_path).name}' is not an AssetRule.")
+            else:
+                log.warning(f"Could not get valid parent index for file '{Path(file_item.file_path).name}' during cleanup tracking.")
 
             # Perform the move using the model's method and the valid source_file_index
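The "BEGIN FIX" block above exists because positional references go stale as soon as the first row is moved: removing a row shifts every row after it, so resolving stored `(row, parent_row, grandparent_row)` triples one at a time inside the move loop dereferences the wrong items. The hazard, and the resolve-everything-first remedy, can be reproduced with plain lists (no Qt involved):

```python
def move_by_positions_naive(src: list, dst: list, positions: list) -> None:
    # BUG: each pop() shifts later positions, so stored indices go stale mid-loop
    for pos in positions:
        dst.append(src.pop(pos))

def move_by_positions_fixed(src: list, dst: list, positions: list) -> None:
    # Resolve all positions to concrete items first, then mutate
    items = [src[pos] for pos in positions]
    for item in items:
        src.remove(item)
        dst.append(item)
```

The Dev branch avoids the problem differently, by looking items up by `file_path` instead of position; the GUI branch keeps positional payloads but front-loads the index reconstruction, which is what the fix markers document.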
@@ -1030,25 +1050,15 @@
             if file_item.target_asset_name_override != target_asset_item.asset_name:
                 log.debug(f"  Updating target override for '{Path(file_item.file_path).name}' to '{target_asset_item.asset_name}'")
                 file_item.target_asset_name_override = target_asset_item.asset_name
-                # Need the *new* index of the moved file to emit dataChanged AND the override changed signal
+                # Need the *new* index of the moved file to emit dataChanged
                 try:
                     # Find the new row of the file item within the target parent's list
                     new_row = target_asset_item.files.index(file_item)
-                    # Create the index for the target asset column (for dataChanged)
-                    new_file_index_col0 = self.index(new_row, 0, parent)
-                    new_file_index_target_col = self.index(new_row, self.COL_TARGET_ASSET, parent)
-                    if new_file_index_target_col.isValid():
-                        moved_files_new_indices[file_item.file_path] = new_file_index_target_col
-                    else:
-                        log.warning(f"  Could not get valid *new* index for target column of moved file: {Path(file_item.file_path).name}")
-
-                    # Emit the targetAssetOverrideChanged signal for the handler
+                    new_file_index_col_0 = self.index(new_row, 0, parent)  # Index for column 0
+                    if new_file_index_col_0.isValid():
+                        self.targetAssetOverrideChanged.emit(file_item, target_asset_item.asset_name, new_file_index_col_0)
+                        log.debug(f"  Emitted targetAssetOverrideChanged for '{Path(file_item.file_path).name}'")
+                    else:
+                        log.warning(f"  Could not get valid *new* index for column 0 of moved file to emit signal: {Path(file_item.file_path).name}")
 
                 except ValueError:
                     log.error(f"  Could not find moved file '{Path(file_item.file_path).name}' in target parent's list after move.")
 
@@ -1064,43 +1074,24 @@
             self.dataChanged.emit(new_index, new_index, [Qt.DisplayRole, Qt.EditRole])
 
         # --- Cleanup: Remove any original parent AssetRules that are now empty ---
-        log.debug(f"dropMimeData: Checking original parents for cleanup: {[f'{path}/{name}' for path, name in original_parents_to_check]}")
-        # Convert set to list to iterate
-        for source_path, asset_name_to_check in list(original_parents_to_check):
-            found_asset_rule_to_check = None
-            # Find the AssetRule object based on source_path and asset_name
-            for source_rule in self._source_rules:
-                if source_rule.input_path == source_path:
-                    for asset_rule in source_rule.assets:
-                        if asset_rule.asset_name == asset_name_to_check:
-                            found_asset_rule_to_check = asset_rule
-                            break
-                if found_asset_rule_to_check: break
-
-            if found_asset_rule_to_check:
+        log.debug(f"dropMimeData: Checking original parents for cleanup: {list(original_parents)}")
+        for gp_row, asset_name in list(original_parents):
             try:
                 # Re-check if the asset is still in the model and is now empty
-                # Use parent back-reference to find the source rule (should be the same as source_rule found above)
-                source_rule = getattr(found_asset_rule_to_check, 'parent_source', None)
-                if source_rule:
-                    # Check if the asset rule is still in its parent's list
-                    if found_asset_rule_to_check in source_rule.assets:
-                        if not found_asset_rule_to_check.files and found_asset_rule_to_check is not target_asset_item:
-                            log.info(f"dropMimeData: Attempting cleanup of now empty original parent: '{found_asset_rule_to_check.asset_name}'")
-                            if not self.removeAssetRule(found_asset_rule_to_check):
-                                log.warning(f"dropMimeData: Failed to remove empty original parent '{found_asset_rule_to_check.asset_name}'.")
-                        elif found_asset_rule_to_check.files:
-                            log.debug(f"dropMimeData: Original parent '{found_asset_rule_to_check.asset_name}' is not empty after moves. Skipping cleanup.")
-                        # If it's the target asset, we don't remove it
-                    else:
-                        log.warning(f"dropMimeData: Cleanup check failed. Original parent asset '{found_asset_rule_to_check.asset_name}' not found in its source rule's list.")
-                else:
-                    log.warning(f"dropMimeData: Cleanup check failed. Original parent asset '{found_asset_rule_to_check.asset_name}' has no parent source reference.")
+                if 0 <= gp_row < len(self._source_rules):
+                    source_rule = self._source_rules[gp_row]
+                    # Find the asset rule within the correct source rule
+                    asset_rule_to_check = next((asset for asset in source_rule.assets if asset.asset_name == asset_name), None)
+
+                    if asset_rule_to_check and not asset_rule_to_check.files and asset_rule_to_check != target_asset_item:
+                        log.info(f"dropMimeData: Attempting cleanup of now empty original parent: '{asset_rule_to_check.asset_name}'")
+                        if not self.removeAssetRule(asset_rule_to_check):
+                            log.warning(f"dropMimeData: Failed to remove empty original parent '{asset_rule_to_check.asset_name}'.")
+                    elif not asset_rule_to_check:
+                        log.warning(f"dropMimeData: Cleanup check failed. Could not find original parent asset '{asset_name}' in source rule at row {gp_row}.")
             except Exception as e:
-                log.exception(f"dropMimeData: Error during cleanup check for parent '{found_asset_rule_to_check.asset_name}': {e}")
|
||||
else:
|
||||
log.warning(f"dropMimeData: Could not find original parent asset '{asset_name_to_check}' for cleanup.")
|
||||
log.warning(f"dropMimeData: Cleanup check failed. Invalid grandparent row index {gp_row} found in original_parents set.")
|
||||
except Exception as e:
|
||||
log.exception(f"dropMimeData: Error during cleanup check for parent '{asset_name}' (gp_row {gp_row}): {e}")
|
||||
|
||||
|
||||
return True
|
||||
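The refactored cleanup above resolves the original parent through a `parent_source` back-reference instead of a stored grandparent row index, which goes stale as soon as rows shift. A minimal sketch of that pattern, using simplified stand-in classes (these `SourceRule`/`AssetRule` dataclasses are illustrative, not the real model items):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SourceRule:
    input_path: str
    assets: list = field(default_factory=list)

@dataclass
class AssetRule:
    asset_name: str
    files: list = field(default_factory=list)
    parent_source: Optional[SourceRule] = None  # back-reference, stays valid when rows move

def cleanup_empty_parent(asset: AssetRule, drop_target: AssetRule) -> bool:
    """Remove `asset` from its source rule if it is now empty and is not the drop target."""
    source = getattr(asset, "parent_source", None)
    if source is None or asset not in source.assets:
        return False  # stale reference: asset was already removed elsewhere
    if asset.files or asset is drop_target:
        return False  # still holds files, or is the asset files were dropped onto
    source.assets.remove(asset)
    return True
```

The key design point is that validity is re-checked (`asset in source.assets`) at cleanup time rather than trusting an index captured before the move.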
360
main.py
@@ -15,12 +15,11 @@ from typing import List, Dict, Tuple, Optional
# --- Utility Imports ---
from utils.hash_utils import calculate_sha256
from utils.path_utils import get_next_incrementing_value
from utils import app_setup_utils # Import the new utility module

# --- Qt Imports for Application Structure ---
from PySide6.QtCore import QObject, Slot, QThreadPool, QRunnable, Signal
from PySide6.QtCore import Qt
from PySide6.QtWidgets import QApplication, QDialog # Import QDialog for the setup dialog
from PySide6.QtWidgets import QApplication

# --- Backend Imports ---
# Add current directory to sys.path for direct execution
@@ -46,10 +45,6 @@ try:
from gui.main_window import MainWindow
print("DEBUG: Successfully imported MainWindow.")

print("DEBUG: Attempting to import FirstTimeSetupDialog...")
from gui.first_time_setup_dialog import FirstTimeSetupDialog # Import the setup dialog
print("DEBUG: Successfully imported FirstTimeSetupDialog.")

print("DEBUG: Attempting to import prepare_processing_workspace...")
from utils.workspace_utils import prepare_processing_workspace
print("DEBUG: Successfully imported prepare_processing_workspace.")
@@ -252,7 +247,6 @@ class ProcessingTask(QRunnable):
else:
log.debug(f"No incrementing token found in pattern '{pattern}'. Skipping increment calculation.")
next_increment_str = None # Or a default like "00" if downstream expects a string, but None is cleaner if handled.
log.debug(f"Calculated next incrementing value for {output_dir}: {next_increment_str}")
else:
log.warning(f"Cannot calculate incrementing value: 'output_directory_pattern' not found in configuration for preset {config.preset_name}")
except Exception as e:
@@ -306,61 +300,68 @@ class App(QObject):
# Signal emitted when all queued processing tasks are complete
all_tasks_finished = Signal(int, int, int) # processed_count, skipped_count, failed_count (Placeholder counts for now)

def __init__(self, user_config_path: str):
def __init__(self):
super().__init__()
self.user_config_path = user_config_path # Store the determined user config path
self.config_obj = None # Initialize config_obj to None
self.processing_engine = None # Initialize processing_engine to None
self.config_obj = None
self.processing_engine = None
self.main_window = None
self.thread_pool = QThreadPool()
self._active_tasks_count = 0
self._task_results = {"processed": 0, "skipped": 0, "failed": 0}
log.info(f"Maximum threads for pool: {self.thread_pool.maxThreadCount()}")

# Configuration, engine, and GUI are now initialized via load_preset
log.debug("App initialized. Configuration, engine, and GUI will be loaded via load_preset.")
self._load_config()
self._init_engine()
self._init_gui()

def _load_config(self, user_config_path: str, preset_name: str):
"""
Loads the configuration using the determined user config path and specified preset.
Sets self.config_obj. Does NOT exit on failure; raises ConfigurationError.
"""
log.debug(f"App: Attempting to load configuration with user_config_path='{user_config_path}' and preset_name='{preset_name}'")
def _load_config(self):
"""Loads the base configuration using a default preset."""
# The actual preset name comes from the GUI request later, but the engine
# needs an initial valid configuration object.
try:
# Convert user_config_path string to a Path object before passing to Configuration
user_config_path_obj = Path(user_config_path)
# Instantiate Configuration with the determined user config path and the specified preset name
self.config_obj = Configuration(preset_name=preset_name, base_dir_user_config=user_config_path_obj)
log.info(f"App: Configuration loaded successfully with preset '{preset_name}'.")
# Find the first available preset to use as a default
preset_dir = Path(__file__).parent / "Presets"
default_preset_name = None
if preset_dir.is_dir():
presets = sorted([f.stem for f in preset_dir.glob("*.json") if f.is_file() and not f.name.startswith('_')])
if presets:
default_preset_name = presets[0]
log.info(f"Using first available preset as default for initial config: '{default_preset_name}'")

if not default_preset_name:
# Fallback or raise error if no presets found
log.error("No presets found in the 'Presets' directory. Cannot initialize default configuration.")
# Option 1: Raise an error
raise ConfigurationError("No presets found to load default configuration.")

self.config_obj = Configuration(preset_name=default_preset_name)
log.info(f"Base configuration loaded using default preset '{default_preset_name}'.")
except ConfigurationError as e:
log.error(f"App: Failed to load configuration with preset '{preset_name}': {e}")
self.config_obj = None # Ensure config_obj is None on failure
raise # Re-raise the exception
log.error(f"Fatal: Failed to load base configuration using default preset: {e}")
# In a real app, show this error to the user before exiting
sys.exit(1)
except Exception as e:
log.exception(f"App: Unexpected error loading configuration with preset '{preset_name}': {e}")
self.config_obj = None # Ensure config_obj is None on failure
raise # Re-raise unexpected errors
log.exception(f"Fatal: Unexpected error loading configuration: {e}")
sys.exit(1)

def _init_engine(self):
"""Initializes the ProcessingEngine if config_obj is available."""
"""Initializes the ProcessingEngine."""
if self.config_obj:
try:
self.processing_engine = ProcessingEngine(self.config_obj)
log.info("App: ProcessingEngine initialized.")
log.info("ProcessingEngine initialized.")
except Exception as e:
log.exception(f"App: Failed to initialize ProcessingEngine: {e}")
self.processing_engine = None # Ensure engine is None on failure
# Depending on context, this might need to be a fatal error.
# For now, log and set to None.
log.exception(f"Fatal: Failed to initialize ProcessingEngine: {e}")
# Show error and exit
sys.exit(1)
else:
log.warning("App: Cannot initialize ProcessingEngine: config_obj is None.")
self.processing_engine = None
log.error("Fatal: Cannot initialize ProcessingEngine without configuration.")
sys.exit(1)

def _init_gui(self):
"""Initializes the MainWindow and connects signals if processing_engine is available."""
if self.processing_engine and self.config_obj:
# Pass the config object to MainWindow during initialization
self.main_window = MainWindow(config=self.config_obj)
"""Initializes the MainWindow and connects signals."""
if self.processing_engine:
self.main_window = MainWindow() # MainWindow now part of the App
# Connect the signal from the GUI to the App's slot using QueuedConnection
# Connect the signal from the MainWindow (which is triggered by the panel) to the App's slot
connection_success = self.main_window.start_backend_processing.connect(self.on_processing_requested, Qt.ConnectionType.QueuedConnection)
@@ -371,53 +372,10 @@ class App(QObject):
log.error("*********************************************************")
# Connect the App's completion signal to the MainWindow's slot
self.all_tasks_finished.connect(self.main_window.on_processing_finished)
log.info("App: MainWindow initialized and signals connected.")
log.info("MainWindow initialized and signals connected.")
else:
log.warning("App: Cannot initialize MainWindow: ProcessingEngine or config_obj is None.")
self.main_window = None # Ensure main_window is None if initialization fails

def load_preset(self, preset_name: str):
"""
Loads the specified preset and re-initializes the configuration and processing engine.
This is intended to be called after App initialization, e.g., by the GUI or autotest.
"""
log.info(f"App: Loading preset '{preset_name}'...")
try:
# Load the configuration with the specified preset
self._load_config(self.user_config_path, preset_name)
log.info(f"App: Configuration reloaded with preset '{preset_name}'.")

# Re-initialize the ProcessingEngine with the new configuration
self._init_engine()
log.info("App: ProcessingEngine re-initialized with new configuration.")

# Initialize GUI if it hasn't been already (e.g., in Autotest where it's needed after config)
if not self.main_window:
self._init_gui()
if self.main_window:
log.debug("App: MainWindow initialized after preset load.")
else:
log.error("App: Failed to initialize MainWindow after preset load.")
else:
# If GUI was already initialized (e.g., in GUI mode),
# inform it about the config change if needed
# (e.g., to update delegates or other config-dependent UI elements)
# The MainWindow and its components (like UnifiedViewModel, MainPanelWidget)
# already hold a reference to the config_obj.
# If they need to react to a *change* in config_obj, they would need
# a signal or a method call here.
# For now, assume they access the updated self.config_obj directly when needed.
log.debug("App: MainWindow already exists, assuming it will use the updated config_obj.")


except ConfigurationError as e:
log.error(f"App: Failed to load preset '{preset_name}': {e}")
# Depending on context (GUI vs CLI/Autotest), this might need to be handled differently.
# For Autotest, this is likely a fatal error. For GUI, show a message box.
raise # Re-raise the exception to be caught by the caller (e.g., Autotest)
except Exception as e:
log.exception(f"App: Unexpected error loading preset '{preset_name}': {e}")
raise # Re-raise unexpected errors
log.error("Fatal: Cannot initialize MainWindow without ProcessingEngine.")
sys.exit(1)

@Slot(list, dict) # Slot to receive List[SourceRule] and processing_settings dict
def on_processing_requested(self, source_rules: list, processing_settings: dict):
@@ -428,98 +386,142 @@ class App(QObject):
log.info(f"VERIFY: App.on_processing_requested received {len(source_rules)} rules.")
for i, rule in enumerate(source_rules):
log.debug(f" VERIFY Rule {i}: Input='{rule.input_path}', Assets={len(rule.assets)}")

if not self.processing_engine:
log.error("Processing engine not available. Cannot process request.")
if self.main_window:
self.main_window.statusBar().showMessage("Error: Processing Engine not ready.", 5000)
# Emit finished signal with failure counts if engine is not ready
self.all_tasks_finished.emit(0, 0, len(source_rules))
return

if not source_rules:
log.warning("Processing requested with an empty rule list.")
if self.main_window:
self.main_window.statusBar().showMessage("No rules to process.", 3000)
# Emit finished signal immediately if no rules
self.all_tasks_finished.emit(0, 0, 0)
return

# Reset task counter and results for this batch
self._active_tasks_count = len(source_rules)
self._task_results = {"processed": 0, "skipped": 0, "failed": 0}
log.info(f"Initialized active task count to: {self._active_tasks_count}")
log.debug(f"Initialized active task count to: {self._active_tasks_count}")

# Update GUI progress bar/status via MainPanelWidget
if self.main_window and hasattr(self.main_window, 'main_panel_widget') and self.main_window.main_panel_widget:
# Set maximum value of progress bar to total number of tasks
self.main_window.main_panel_widget.progress_bar.setMaximum(self._active_tasks_count)
self.main_window.main_panel_widget.update_progress_bar(0, self._active_tasks_count) # Start at 0
else:
log.warning("App: Cannot update progress bar, main_window or main_panel_widget not available.")
self.main_window.main_panel_widget.progress_bar.setMaximum(len(source_rules))
self.main_window.main_panel_widget.progress_bar.setValue(0)
self.main_window.main_panel_widget.progress_bar.setFormat(f"0/{len(source_rules)} tasks")

# Extract processing settings
output_dir = Path(processing_settings.get("output_dir"))
overwrite = processing_settings.get("overwrite", False)
# Workers setting is used by QThreadPool itself, not passed to individual tasks
# blender_enabled, nodegroup_blend_path, materials_blend_path are not used by the engine directly,
# they would be handled by a post-processing stage if implemented.
# --- Get paths needed for ProcessingTask ---
try:
# Get output_dir from processing_settings passed from autotest.py
output_base_path_str = processing_settings.get("output_dir")
log.info(f"APP_DEBUG: Received output_dir in processing_settings: {output_base_path_str}")

if not output_base_path_str:
log.error("Cannot queue tasks: Output directory path is empty in processing_settings.")
# self.main_window.statusBar().showMessage("Error: Output directory cannot be empty.", 5000) # GUI specific
return
output_base_path = Path(output_base_path_str)
# Basic validation - check if it's likely a valid path structure (doesn't guarantee existence/writability here)
if not output_base_path.is_absolute():
# Or attempt to resolve relative to workspace? For now, require absolute from GUI.
log.warning(f"Output path '{output_base_path}' is not absolute. Processing might fail if relative path is not handled correctly by engine.")
# Consider resolving: output_base_path = Path.cwd() / output_base_path # If relative paths are allowed

# Define workspace path (assuming main.py is in the project root)
workspace_path = Path(__file__).parent.resolve()
log.debug(f"Using Workspace Path: {workspace_path}")
log.debug(f"Using Output Base Path: {output_base_path}")

except Exception as e:
log.exception(f"Error getting/validating paths for processing task: {e}")
self.main_window.statusBar().showMessage(f"Error preparing paths: {e}", 5000)
return
# --- End Get paths ---


# Set max threads based on GUI setting
worker_count = processing_settings.get('workers', 1)
self.thread_pool.setMaxThreadCount(worker_count)
log.info(f"Set thread pool max workers to: {worker_count}")

# Queue tasks in the thread pool
log.debug("DEBUG: Entering task queuing loop.")
for i, rule in enumerate(source_rules):
if isinstance(rule, SourceRule):
log.info(f"DEBUG Task {i+1}: Rule Input='{rule.input_path}', Supplier ID='{getattr(rule, 'supplier_identifier', 'Not Set')}', Preset='{getattr(rule, 'preset_name', 'Not Set')}'")
log.debug(f"DEBUG: Preparing to queue task {i+1}/{len(source_rules)} for rule: {rule.input_path}")

# --- Create a new Configuration and Engine instance for this specific task ---
task_engine = None
try:
# Get preset name from the rule, fallback to app's default if missing
preset_name_for_task = getattr(rule, 'preset_name', None)
if not preset_name_for_task:
log.warning(f"Task {i+1} (Rule: {rule.input_path}): SourceRule missing preset_name. Falling back to default preset '{self.config_obj.preset_name}'.")
preset_name_for_task = self.config_obj.preset_name

task_config = Configuration(preset_name=preset_name_for_task)
task_engine = ProcessingEngine(task_config)
log.debug(f"Task {i+1}: Created new ProcessingEngine instance with preset '{preset_name_for_task}'.")

except ConfigurationError as config_err:
log.error(f"Task {i+1} (Rule: {rule.input_path}): Failed to load configuration for preset '{preset_name_for_task}': {config_err}. Skipping task.")
self._active_tasks_count -= 1 # Decrement count as this task won't run
self._task_results["failed"] += 1
# Optionally update GUI status for this specific rule
self.main_window.update_file_status(str(rule.input_path), "failed", f"Config Error: {config_err}")
continue # Skip to the next rule
except Exception as engine_err:
log.exception(f"Task {i+1} (Rule: {rule.input_path}): Failed to initialize ProcessingEngine for preset '{preset_name_for_task}': {engine_err}. Skipping task.")
self._active_tasks_count -= 1 # Decrement count
self._task_results["failed"] += 1
self.main_window.update_file_status(str(rule.input_path), "failed", f"Engine Init Error: {engine_err}")
continue # Skip to the next rule

if task_engine is None: # Should not happen if exceptions are caught, but safety check
log.error(f"Task {i+1} (Rule: {rule.input_path}): Engine is None after initialization attempt. Skipping task.")
self._active_tasks_count -= 1 # Decrement count
self._task_results["failed"] += 1
self.main_window.update_file_status(str(rule.input_path), "failed", "Engine initialization failed (unknown reason).")
continue # Skip to the next rule
# --- End Engine Instantiation ---

# Submit tasks to the thread pool
log.info(f"Submitting {len(source_rules)} processing tasks to the thread pool.")
for rule in source_rules:
# Create a ProcessingTask for each SourceRule
# workspace_path, incrementing_value, and sha5_value are calculated within ProcessingTask.run
task = ProcessingTask(
engine=self.processing_engine,
engine=task_engine,
rule=rule,
workspace_path=Path(rule.input_path), # Pass the original input path for workspace preparation
output_base_path=output_dir
workspace_path=workspace_path,
output_base_path=output_base_path # This is Path(output_base_path_str)
)
# Connect the task's finished signal to the App's slot
log.info(f"APP_DEBUG: Passing to ProcessingTask: output_base_path = {output_base_path}")
task.signals.finished.connect(self._on_task_finished)
# Start the task in the thread pool
log.debug(f"DEBUG: Calling thread_pool.start() for task {i+1}")
self.thread_pool.start(task)
log.debug(f"Submitted task for rule: {rule.input_path}")
log.debug(f"DEBUG: Returned from thread_pool.start() for task {i+1}")
else:
log.warning(f"Skipping invalid item (index {i}) in rule list: {type(rule)}")

log.info("All processing tasks submitted to thread pool.")
log.info(f"Queued {len(source_rules)} processing tasks (finished loop).")
# GUI status already updated in MainWindow when signal was emitted

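A notable pattern in the queuing loop above: each rule gets its own engine, built from the rule's `preset_name` with a fallback to the app default, and a rule whose engine cannot be built is counted as failed immediately (so the active-task counter stays consistent). A minimal sketch of that accounting, with Qt stripped out (`queue_tasks` and `make_engine` are illustrative names, not from the codebase):

```python
def queue_tasks(rules, default_preset, make_engine):
    """Build one engine per rule; on failure, count the rule as failed and skip it.

    `make_engine(preset_name)` stands in for Configuration + ProcessingEngine
    construction and may raise on a bad preset.
    """
    results = {"queued": [], "failed": 0}
    for rule in rules:
        # Fall back to the app default when the rule carries no preset name.
        preset = getattr(rule, "preset_name", None) or default_preset
        try:
            engine = make_engine(preset)
        except Exception:
            results["failed"] += 1  # this task never runs, so count it now
            continue
        results["queued"].append((rule, engine))
    return results
```

The design choice worth reviewing is whether a failed engine build should decrement the shared `_active_tasks_count` (as the diff does) or never increment it in the first place; the latter avoids the negative-count guard seen later.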
@Slot(str, str, object) # rule_input_path, status, result/error
def _on_task_finished(self, rule_input_path: str, status: str, result_or_error: object):
"""Slot to handle the completion of an individual processing task."""
log.debug(f"DEBUG: App._on_task_finished slot entered for rule: {rule_input_path} with status: {status}")

# Decrement the active task count
# --- Slot to handle completion of individual tasks ---
@Slot(str, str, object)
def _on_task_finished(self, rule_input_path, status, result_or_error):
"""Handles the 'finished' signal from a ProcessingTask."""
log.info(f"Task finished signal received for {rule_input_path}. Status: {status}")
self._active_tasks_count -= 1
log.debug(f"Active tasks remaining: {self._active_tasks_count}")

# Update task results based on status
# Update overall results (basic counts for now)
if status == "processed":
self._task_results["processed"] += 1
elif status == "skipped":
elif status == "skipped": # Assuming engine might return 'skipped' status eventually
self._task_results["skipped"] += 1
elif status.startswith("failed"): # Catches "failed_preparation" and "failed_processing"
else: # Count all other statuses (failed_preparation, failed_processing) as failed
self._task_results["failed"] += 1
log.error(f"Task failed for {rule_input_path}: {result_or_error}")
else:
log.warning(f"Task finished with unknown status '{status}' for {rule_input_path}. Treating as failed.")
self._task_results["failed"] += 1
log.error(f"Task with unknown status failed for {rule_input_path}: {result_or_error}")

log.info(f"Task finished for {rule_input_path}. Status: {status}. Remaining tasks: {self._active_tasks_count}")
log.debug(f"Current task results: Processed={self._task_results['processed']}, Skipped={self._task_results['skipped']}, Failed={self._task_results['failed']}")
# Update progress bar via MainPanelWidget
total_tasks = self.main_window.main_panel_widget.progress_bar.maximum()
completed_tasks = total_tasks - self._active_tasks_count
self.main_window.main_panel_widget.update_progress_bar(completed_tasks, total_tasks) # Use MainPanelWidget's method

# Update GUI progress bar
if self.main_window and hasattr(self.main_window, 'main_panel_widget') and self.main_window.main_panel_widget:
completed_tasks = self._task_results["processed"] + self._task_results["skipped"] + self._task_results["failed"]
self.main_window.main_panel_widget.update_progress_bar(completed_tasks, self._task_results["processed"] + self._task_results["skipped"] + self._task_results["failed"] + self._active_tasks_count) # Update with current counts
# Update status text if needed (e.g., "Processing X of Y...")
self.main_window.main_panel_widget.set_progress_bar_text(f"Processing: {completed_tasks}/{self._task_results['processed'] + self._task_results['skipped'] + self._task_results['failed'] + self._active_tasks_count}")
else:
log.warning("App: Cannot update progress bar in _on_task_finished, main_window or main_panel_widget not available.")
# Update status for the specific file in the GUI (if needed)


# Check if all tasks are finished
if self._active_tasks_count <= 0: # Use <= 0 to handle potential errors leading to negative count
if self._active_tasks_count == 0:
log.info("All processing tasks finished.")
# Emit the signal with the final counts
self.all_tasks_finished.emit(
@@ -527,9 +529,6 @@ class App(QObject):
self._task_results["skipped"],
self._task_results["failed"]
)
# Reset task count to 0 explicitly
self._active_tasks_count = 0
log.debug("Emitted all_tasks_finished signal.")
elif self._active_tasks_count < 0:
log.error("Error: Active task count went below zero!") # Should not happen

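The two sides of this hunk classify task statuses differently: one matches any status beginning with `failed` and warns on unknown strings, the other buckets everything that is not `processed`/`skipped` as failed. A sketch of the stricter variant, which keeps unknown statuses visible (helper names are illustrative):

```python
def classify_status(status: str) -> str:
    """Map a task status string onto one of the three result buckets."""
    if status == "processed":
        return "processed"
    if status == "skipped":
        return "skipped"
    if status.startswith("failed"):  # "failed_preparation", "failed_processing", ...
        return "failed"
    # Unknown statuses are still counted as failures, but the real code
    # logs a warning here so they are not silently absorbed.
    return "failed"

def tally(statuses):
    """Aggregate a batch of task statuses into the results dict shape used by App."""
    counts = {"processed": 0, "skipped": 0, "failed": 0}
    for s in statuses:
        counts[classify_status(s)] += 1
    return counts
```

Either way, the bucket counts must always sum to the number of finished tasks, which is what the `all_tasks_finished` emission relies on.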
@@ -541,14 +540,6 @@ class App(QObject):
else:
log.error("Cannot run application, MainWindow not initialized.")

def run(self):
"""Shows the main window."""
if self.main_window:
self.main_window.show()
log.info("Application started. Showing main window.")
else:
log.error("Cannot run application, MainWindow not initialized.")


if __name__ == "__main__":
parser = setup_arg_parser()
@@ -567,58 +558,9 @@ if __name__ == "__main__":
log.info("No required CLI arguments detected, starting GUI mode.")
# --- Run the GUI Application ---
try:
user_config_path = app_setup_utils.read_saved_user_config_path()
log.debug(f"Read saved user config path: {user_config_path}")

first_run_needed = False
if user_config_path is None or not user_config_path.strip():
log.info("No saved user config path found. First run setup needed.")
first_run_needed = True
else:
user_config_dir = Path(user_config_path)
marker_file = app_setup_utils.get_first_run_marker_file(user_config_path)
if not user_config_dir.is_dir():
log.warning(f"Saved user config directory does not exist: {user_config_path}. First run setup needed.")
first_run_needed = True
elif not Path(marker_file).is_file():
log.warning(f"First run marker file not found in {user_config_path}. First run setup needed.")
first_run_needed = True
else:
log.info(f"Saved user config path found and valid: {user_config_path}. Marker file exists.")

qt_app = None
if first_run_needed:
log.info("Initiating first-time setup dialog.")
# Need a QApplication instance to show the dialog
qt_app = QApplication.instance()
if qt_app is None:
qt_app = QApplication(sys.argv)

dialog = FirstTimeSetupDialog()
if dialog.exec() == QDialog.Accepted:
user_config_path = dialog.get_chosen_path()
log.info(f"First-time setup completed. Chosen path: {user_config_path}")
# The dialog should have already saved the path and created the marker file
else:
log.info("First-time setup cancelled by user. Exiting application.")
sys.exit(0) # Exit gracefully

# If qt_app was created for the dialog, reuse it. Otherwise, create it now.
if qt_app is None:
qt_app = QApplication.instance()
if qt_app is None:
qt_app = QApplication(sys.argv)


# Ensure user_config_path is set before initializing App
if not user_config_path or not Path(user_config_path).is_dir():
log.error(f"Fatal: User config path is invalid or not set after setup: {user_config_path}. Cannot proceed.")
sys.exit(1)


app_instance = App(user_config_path) # Pass the determined path
# Load an initial preset after App initialization to set up config, engine, and GUI
app_instance.load_preset("_template")
app_instance = App()
app_instance.run()

sys.exit(qt_app.exec())

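The first-run branch above reduces to three triggers: no saved path, a saved directory that no longer exists, or a missing marker file. That decision is easy to isolate and test; a sketch under the assumption that the marker is a plain file inside the config directory (the function and the `.setup_complete` marker name are hypothetical, the real name comes from `app_setup_utils.get_first_run_marker_file`):

```python
from pathlib import Path
from typing import Optional

def first_run_needed(saved_path: Optional[str], marker_name: str = ".setup_complete") -> bool:
    """True if the saved config path is unset, missing, or lacks the first-run marker."""
    if saved_path is None or not saved_path.strip():
        return True  # nothing saved yet
    config_dir = Path(saved_path)
    if not config_dir.is_dir():
        return True  # saved directory no longer exists
    return not (config_dir / marker_name).is_file()  # marker missing -> rerun setup
```

Keeping this check pure (no dialog, no `QApplication`) would also let the surrounding `__main__` block stay focused on wiring up the setup dialog only when the check says so.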
@@ -64,7 +64,7 @@ class InitialScalingOutput:
@dataclass
class SaveVariantsInput:
image_data: np.ndarray # Final data (potentially scaled)
final_internal_map_type: str # Final internal type (e.g., MAP_ROUGH, MAP_COL-1)
internal_map_type: str # Final internal type (e.g., MAP_ROUGH, MAP_COL-1)
source_bit_depth_info: List[int]
# Configuration needed
output_filename_pattern_tokens: Dict[str, Any]

@@ -284,11 +284,11 @@ class PipelineOrchestrator:

save_input = SaveVariantsInput(
image_data=current_image_data,
final_internal_map_type=item.map_type_identifier,
internal_map_type=item.map_type_identifier,
source_bit_depth_info=[item.bit_depth] if item.bit_depth is not None else [8], # Default to 8 if not set
output_filename_pattern_tokens=output_filename_tokens,
image_resolutions=save_specific_resolutions, # Pass the specific resolution(s)
file_type_defs=context.config_obj.get_file_type_definitions_with_examples(),
file_type_defs=getattr(context.config_obj, "FILE_TYPE_DEFINITIONS", {}),
output_format_8bit=context.config_obj.get_8bit_output_format(),
output_format_16bit_primary=context.config_obj.get_16bit_output_formats()[0],
output_format_16bit_fallback=context.config_obj.get_16bit_output_formats()[1],
@@ -378,7 +378,7 @@ class PipelineOrchestrator:
# The `image_saving_utils.save_image_variants` will iterate through `context.config_obj.image_resolutions`.
save_input = SaveVariantsInput(
image_data=current_image_data,
final_internal_map_type=processed_data.output_map_type,
internal_map_type=processed_data.output_map_type,
source_bit_depth_info=processed_data.source_bit_depths,
output_filename_pattern_tokens=output_filename_tokens,
image_resolutions=context.config_obj.image_resolutions, # Pass all configured resolutions

@@ -97,7 +97,6 @@ class OutputOrganizationStage(ProcessingStage):
token_data_variant = {
"assetname": asset_name_for_log,
"supplier": context.effective_supplier or "DefaultSupplier",
"asset_category": context.asset_rule.asset_type, # Used asset_type for asset_category token
"maptype": base_map_type,
"resolution": variant_resolution_key,
"ext": variant_ext,
@@ -165,7 +164,6 @@ class OutputOrganizationStage(ProcessingStage):
token_data = {
"assetname": asset_name_for_log,
"supplier": context.effective_supplier or "DefaultSupplier",
"asset_category": context.asset_rule.asset_type, # Used asset_type for asset_category token
"maptype": base_map_type,
"resolution": resolution_str,
"ext": temp_file_path.suffix.lstrip('.'),
@@ -218,6 +216,7 @@ class OutputOrganizationStage(ProcessingStage):
# --- Handle other statuses (Skipped, Failed, etc.) ---
else: # Catches statuses not explicitly handled above
logger.debug(f"Asset '{asset_name_for_log}': Skipping map key '{processed_map_key}' (status: '{map_status}') for organization as it's not a recognized final processed state or variant state.")
continue
else:
logger.debug(f"Asset '{asset_name_for_log}': No processed individual maps to organize.")

@@ -246,7 +245,6 @@ class OutputOrganizationStage(ProcessingStage):
base_token_data = {
"assetname": asset_name_for_log,
"supplier": context.effective_supplier or "DefaultSupplier",
"asset_category": context.asset_rule.asset_type, # Used asset_type for asset_category token
# Add other tokens if your output_directory_pattern uses them at the asset level
"incrementingvalue": getattr(context, 'incrementing_value', None),
"sha5": getattr(context, 'sha5_value', None)

@ -178,20 +178,12 @@ class RegularMapProcessorStage(ProcessingStage):
|
||||
log.debug(f"{log_prefix}: Loaded image {result.original_dimensions[0]}x{result.original_dimensions[1]}.")
|
||||
|
||||
# Get original bit depth
|
||||
# Determine original bit depth from the loaded image data's dtype
|
||||
dtype_to_bit_depth = {
|
||||
np.dtype('uint8'): 8,
|
||||
np.dtype('uint16'): 16,
|
||||
np.dtype('float32'): 32,
|
||||
np.dtype('int8'): 8,
|
||||
np.dtype('int16'): 16,
|
||||
}
|
||||
result.original_bit_depth = dtype_to_bit_depth.get(source_image_data.dtype)
|
||||
|
||||
if result.original_bit_depth is None:
|
||||
log.warning(f"{log_prefix}: Unknown dtype {source_image_data.dtype} for loaded image data, cannot determine bit depth. Setting to None.")
|
||||
else:
|
||||
log.info(f"{log_prefix}: Determined source bit depth from loaded data dtype: {result.original_bit_depth}")
|
||||
try:
|
||||
result.original_bit_depth = ipu.get_image_bit_depth(str(source_file_path_found))
|
||||
log.info(f"{log_prefix}: Determined source bit depth: {result.original_bit_depth}")
|
||||
except Exception as e:
|
||||
log.warning(f"{log_prefix}: Could not determine source bit depth for {source_file_path_found}: {e}. Setting to None.")
|
||||
result.original_bit_depth = None # Indicate failure to determine
|
||||
|
||||
# --- Apply Transformations ---
|
||||
transformed_image_data, final_map_type, transform_notes = ipu.apply_common_map_transformations(
|
||||
@ -205,11 +197,6 @@ class RegularMapProcessorStage(ProcessingStage):
|
||||
result.final_internal_map_type = final_map_type # Update if Gloss->Rough changed it
|
||||
result.transformations_applied = transform_notes
|
||||
|
||||
# Log dtype and shape after transformations
|
||||
log.info(f"{log_prefix}: Image data dtype after transformations: {transformed_image_data.dtype}, shape: {transformed_image_data.shape}")
|
||||
bit_depth_after_transform = dtype_to_bit_depth.get(transformed_image_data.dtype)
|
||||
log.info(f"{log_prefix}: Determined bit depth after transformations: {bit_depth_after_transform}")
|
||||
|
||||
# --- Determine Resolution Key for LOWRES ---
|
||||
if config.enable_low_resolution_fallback and result.original_dimensions:
|
||||
w, h = result.original_dimensions
|
||||
@ -221,7 +208,6 @@ class RegularMapProcessorStage(ProcessingStage):
|
||||
result.status = "Processed"
|
||||
result.error_message = None
|
||||
log.info(f"{log_prefix}: Successfully processed regular map. Final type: '{result.final_internal_map_type}', ResolutionKey: {result.resolution_key}.")
|
||||
log.debug(f"{log_prefix}: Processed image data dtype before returning: {result.processed_image_data.dtype}, shape: {result.processed_image_data.shape}")
|
||||
|
||||
except Exception as e:
|
||||
log.exception(f"{log_prefix}: Unhandled exception during processing: {e}")
|
||||
|
||||
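The dtype-based bit-depth detection that this branch introduces in `RegularMapProcessorStage` can be sketched as a standalone helper (a minimal sketch; the function name `infer_bit_depth` is illustrative, not from the repository):

```python
import numpy as np

# Mirrors the mapping in the diff: bit depth is inferred from the loaded
# array's dtype rather than re-reading the file from disk.
DTYPE_TO_BIT_DEPTH = {
    np.dtype('uint8'): 8,
    np.dtype('uint16'): 16,
    np.dtype('float32'): 32,
    np.dtype('int8'): 8,
    np.dtype('int16'): 16,
}


def infer_bit_depth(image_data: np.ndarray):
    """Return the source bit depth for known dtypes, else None.

    A None result corresponds to the "Unknown dtype" warning branch above.
    """
    return DTYPE_TO_BIT_DEPTH.get(image_data.dtype)
```

Unlike the replaced `ipu.get_image_bit_depth(path)` call, this cannot raise on I/O, which is why the try/except around it disappears in the new version.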
@@ -22,7 +22,7 @@ class SaveVariantsStage(ProcessingStage):
"""
Calls isu.save_image_variants with data from input_data.
"""
internal_map_type = input_data.final_internal_map_type
internal_map_type = input_data.internal_map_type
# The input_data for SaveVariantsStage doesn't directly contain the ProcessingItem.
# It receives data *derived* from a ProcessingItem by previous stages.
# For debugging, we'd need to pass more context or rely on what's in output_filename_pattern_tokens.
@@ -59,7 +59,7 @@

save_args = {
"source_image_data": input_data.image_data,
"final_internal_map_type": input_data.final_internal_map_type, # Pass the internal type identifier
"base_map_type": base_map_type_friendly, # Use the friendly type
"source_bit_depth_info": input_data.source_bit_depth_info,
"image_resolutions": input_data.image_resolutions,
"file_type_defs": input_data.file_type_defs,

@@ -304,11 +304,9 @@ def load_image(image_path: Union[str, Path], read_flag: int = cv2.IMREAD_UNCHANG
try:
img = cv2.imread(str(image_path), read_flag)
if img is None:
ipu_log.warning(f"Failed to load image: {image_path}")
# print(f"Warning: Failed to load image: {image_path}") # Optional: for debugging utils
return None

ipu_log.debug(f"Loaded image '{image_path}'. Initial dtype: {img.dtype}, shape: {img.shape}")

# Ensure RGB/RGBA for color images
if len(img.shape) == 3:
if img.shape[2] == 4: # BGRA from OpenCV
@@ -394,11 +392,8 @@ def save_image(
path_obj = Path(image_path)
path_obj.parent.mkdir(parents=True, exist_ok=True)

ipu_log.debug(f"Saving image '{path_obj}'. Initial data dtype: {img_to_save.dtype}, shape: {img_to_save.shape}")

# 1. Data Type Conversion
if output_dtype_target is not None:
ipu_log.debug(f"Attempting to convert image data to target dtype: {output_dtype_target}")
if output_dtype_target == np.uint8 and img_to_save.dtype != np.uint8:
if img_to_save.dtype == np.uint16: img_to_save = (img_to_save.astype(np.float32) / 65535.0 * 255.0).astype(np.uint8)
elif img_to_save.dtype in [np.float16, np.float32, np.float64]: img_to_save = (np.clip(img_to_save, 0.0, 1.0) * 255.0).astype(np.uint8)
@@ -418,8 +413,6 @@
elif img_to_save.dtype == np.float16: img_to_save = img_to_save.astype(np.float32)


ipu_log.debug(f"Saving image '{path_obj}'. Data dtype after conversion attempt: {img_to_save.dtype}, shape: {img_to_save.shape}")

# 2. Color Space Conversion (Internal RGB/RGBA -> BGR/BGRA for OpenCV)
# Input `image_data` is assumed to be in RGB/RGBA format (due to `load_image` changes).
# OpenCV's `imwrite` typically expects BGR/BGRA for formats like PNG, JPG.
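The dtype-conversion branch of `save_image` shown above can be isolated into a small helper (an illustrative sketch of the same arithmetic; `to_uint8` is our name, not the module's): uint16 data is rescaled from the [0, 65535] range, while float data is assumed to be normalized to [0, 1] and clipped before scaling.

```python
import numpy as np


def to_uint8(img: np.ndarray) -> np.ndarray:
    """Convert image data to uint8, mirroring save_image's conversion branch."""
    if img.dtype == np.uint8:
        return img
    if img.dtype == np.uint16:
        # Rescale full 16-bit range down to 8-bit.
        return (img.astype(np.float32) / 65535.0 * 255.0).astype(np.uint8)
    if img.dtype in (np.float16, np.float32, np.float64):
        # Floats are assumed normalized; clip guards out-of-range values.
        return (np.clip(img, 0.0, 1.0) * 255.0).astype(np.uint8)
    raise TypeError(f"Unsupported dtype: {img.dtype}")
```

Note that `astype(np.uint8)` truncates rather than rounds, so a float value of 0.5 maps to 127, matching the in-repo expression.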
@@ -467,8 +460,6 @@ def apply_common_map_transformations(
current_image_data = image_data # Start with original data
updated_processing_map_type = processing_map_type # Start with original type

ipu_log.debug(f"{log_prefix}: apply_common_map_transformations - Initial image data dtype: {current_image_data.dtype}, shape: {current_image_data.shape}")

# Gloss-to-Rough
# Check if the base type is Gloss (before suffix)
base_map_type_match = re.match(r"(MAP_GLOSS)", processing_map_type)
@@ -503,8 +494,6 @@ def apply_common_map_transformations(
current_image_data = invert_normal_map_green_channel(current_image_data)
transformation_notes.append("Normal Green Inverted (Global)")

ipu_log.debug(f"{log_prefix}: apply_common_map_transformations - Final image data dtype: {current_image_data.dtype}, shape: {current_image_data.shape}")

return current_image_data, updated_processing_map_type, transformation_notes

# --- Normal Map Utilities ---

@@ -4,9 +4,6 @@ import numpy as np
from pathlib import Path
from typing import List, Dict, Any, Tuple, Optional

# Import necessary utility functions
from utils.path_utils import get_filename_friendly_map_type # Import the function

# Potentially import ipu from ...utils import image_processing_utils as ipu
# Assuming ipu is available in the same utils directory or parent
try:
@@ -25,7 +22,7 @@ logger = logging.getLogger(__name__)

def save_image_variants(
source_image_data: np.ndarray,
final_internal_map_type: str, # Use the internal map type identifier
base_map_type: str, # Filename-friendly map type
source_bit_depth_info: List[Optional[int]],
image_resolutions: Dict[str, int],
file_type_defs: Dict[str, Dict[str, Any]],
@@ -45,13 +42,14 @@ def save_image_variants(

Args:
source_image_data (np.ndarray): High-res image data (in memory, potentially transformed).
final_internal_map_type (str): Final internal map type (e.g., "MAP_COL", "MAP_NRM", "MAP_NRMRGH").
base_map_type (str): Final map type (e.g., "COL", "ROUGH", "NORMAL", "MAP_NRMRGH").
This is the filename-friendly map type.
source_bit_depth_info (List[Optional[int]]): List of original source bit depth(s)
(e.g., [8], [16], [8, 16]). Can contain None.
image_resolutions (Dict[str, int]): Dictionary mapping resolution keys (e.g., "4K")
to max dimensions (e.g., 4096).
file_type_defs (Dict[str, Dict[str, Any]]): Dictionary defining properties for map types,
including 'bit_depth_policy'.
including 'bit_depth_rule'.
output_format_8bit (str): File extension for 8-bit output (e.g., "jpg", "png").
output_format_16bit_primary (str): Primary file extension for 16-bit output (e.g., "png", "tif").
output_format_16bit_fallback (str): Fallback file extension for 16-bit output.
@@ -78,46 +76,30 @@ def save_image_variants(
source_max_dim = max(source_h, source_w)

# 1. Use provided configuration inputs (already available as function arguments)
logger.info(f"SaveImageVariants: Starting for map type: {final_internal_map_type}. Source shape: {source_image_data.shape}, Source bit depths: {source_bit_depth_info}")
logger.info(f"SaveImageVariants: Starting for map type: {base_map_type}. Source shape: {source_image_data.shape}, Source bit depths: {source_bit_depth_info}")
logger.debug(f"SaveImageVariants: Resolutions: {image_resolutions}, File Type Defs: {file_type_defs.keys()}, Output Formats: 8bit={output_format_8bit}, 16bit_pri={output_format_16bit_primary}, 16bit_fall={output_format_16bit_fallback}")
logger.debug(f"SaveImageVariants: PNG Comp: {png_compression_level}, JPG Qual: {jpg_quality}")
logger.debug(f"SaveImageVariants: Output Tokens: {output_filename_pattern_tokens}, Output Pattern: {output_filename_pattern}")
logger.debug(f"SaveImageVariants: Received resolution_threshold_for_jpg: {resolution_threshold_for_jpg}") # Log received threshold

# 2. Determine Target Bit Depth based on bit_depth_policy
# Use the final_internal_map_type for lookup in file_type_defs
bit_depth_policy = file_type_defs.get(final_internal_map_type, {}).get('bit_depth_policy', '')
# 2. Determine Target Bit Depth
target_bit_depth = 8 # Default
bit_depth_rule = file_type_defs.get(base_map_type, {}).get('bit_depth_rule', 'force_8bit')
if bit_depth_rule not in ['force_8bit', 'respect_inputs']:
logger.warning(f"Unknown bit_depth_rule '{bit_depth_rule}' for map type '{base_map_type}'. Defaulting to 'force_8bit'.")
bit_depth_rule = 'force_8bit'

logger.info(f"SaveImageVariants: Determining target bit depth for map type: {final_internal_map_type} with policy: '{bit_depth_policy}'. Source bit depths: {source_bit_depth_info}")

if bit_depth_policy == "force_8bit":
target_bit_depth = 8
logger.debug(f"SaveImageVariants: Policy 'force_8bit' applied. Target bit depth: {target_bit_depth}")
elif bit_depth_policy == "force_16bit":
target_bit_depth = 16
logger.debug(f"SaveImageVariants: Policy 'force_16bit' applied. Target bit depth: {target_bit_depth}")
elif bit_depth_policy == "preserve":
if bit_depth_rule == 'respect_inputs':
# Check if any source bit depth is > 8, ignoring None
if any(depth is not None and depth > 8 for depth in source_bit_depth_info):
target_bit_depth = 16
logger.debug(f"SaveImageVariants: Policy 'preserve' applied, source > 8 found. Setting target_bit_depth = {target_bit_depth}")
else:
target_bit_depth = 8
logger.debug(f"SaveImageVariants: Policy 'preserve' applied, no source > 8 found. Setting target_bit_depth = {target_bit_depth}")
elif bit_depth_policy == "" or bit_depth_policy not in ["force_8bit", "force_16bit", "preserve"]:
# Handle "" policy or any other unexpected/unknown value
# For unknown/empty policies, apply the 'preserve' logic based on source bit depths.
if bit_depth_policy == "":
logger.warning(f"Empty bit_depth_policy for map type '{final_internal_map_type}'. Applying 'preserve' logic.")
else:
logger.warning(f"Unknown bit_depth_policy '{bit_depth_policy}' for map type '{final_internal_map_type}'. Applying 'preserve' logic.")
logger.info(f"Bit depth rule 'respect_inputs' applied. Source bit depths: {source_bit_depth_info}. Target bit depth: {target_bit_depth}")
else: # force_8bit
target_bit_depth = 8
logger.info(f"Bit depth rule 'force_8bit' applied. Target bit depth: {target_bit_depth}")

if any(depth is not None and depth > 8 for depth in source_bit_depth_info):
target_bit_depth = 16
logger.debug(f"SaveImageVariants: Applying 'preserve' logic, source > 8 found. Setting target_bit_depth = {target_bit_depth}")
else:
target_bit_depth = 8
logger.debug(f"SaveImageVariants: Applying 'preserve' logic, no source > 8 found. Setting target_bit_depth = {target_bit_depth}")

# 3. Determine Output File Format(s)
if target_bit_depth == 8:
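The simplified `bit_depth_rule` branch that replaces the old four-way `bit_depth_policy` can be sketched on its own (a minimal sketch without the logging; the helper name `resolve_target_bit_depth` is illustrative):

```python
from typing import List, Optional


def resolve_target_bit_depth(bit_depth_rule: str,
                             source_bit_depth_info: List[Optional[int]]) -> int:
    """Mirror the bit-depth decision in save_image_variants.

    'respect_inputs' promotes to 16-bit when any known source depth exceeds 8;
    anything else, including unknown rule strings, falls back to 'force_8bit'.
    """
    if bit_depth_rule != 'respect_inputs':
        # The real code logs a warning for unknown rules before coercing.
        return 8
    if any(depth is not None and depth > 8 for depth in source_bit_depth_info):
        return 16
    return 8
```

This collapses the old `force_8bit` / `force_16bit` / `preserve` / empty-policy cases into two rules, with `respect_inputs` behaving like the former `preserve`.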
@@ -136,27 +118,26 @@ def save_image_variants(

current_output_ext = output_ext # Store the initial extension based on bit depth

# Move this logging statement AFTER current_output_ext is assigned
logger.info(f"SaveImageVariants: Final determined target bit depth: {target_bit_depth}, Initial output format: {current_output_ext} for map type {final_internal_map_type}")
logger.info(f"SaveImageVariants: Determined target bit depth: {target_bit_depth}, Initial output format: {current_output_ext} for map type {base_map_type}")

# 4. Generate and Save Resolution Variants
# Sort resolutions by max dimension descending
sorted_resolutions = sorted(image_resolutions.items(), key=lambda item: item[1], reverse=True)

for res_key, res_max_dim in sorted_resolutions:
logger.info(f"SaveImageVariants: Processing variant {res_key} ({res_max_dim}px) for {final_internal_map_type}")
logger.info(f"SaveImageVariants: Processing variant {res_key} ({res_max_dim}px) for {base_map_type}")

# --- Prevent Upscaling ---
# Skip this resolution variant if its target dimension is larger than the source image's largest dimension.
if res_max_dim > source_max_dim:
logger.info(f"SaveImageVariants: Skipping variant {res_key} ({res_max_dim}px) for {final_internal_map_type} because target resolution is larger than source ({source_max_dim}px).")
logger.info(f"SaveImageVariants: Skipping variant {res_key} ({res_max_dim}px) for {base_map_type} because target resolution is larger than source ({source_max_dim}px).")
continue # Skip to the next resolution

# Calculate target dimensions for valid variants (equal or smaller than source)
if source_max_dim == res_max_dim:
# Use source dimensions if target is equal
target_w_res, target_h_res = source_w, source_h
logger.info(f"SaveImageVariants: Using source resolution ({source_w}x{source_h}) for {res_key} variant of {final_internal_map_type} as target matches source.")
logger.info(f"SaveImageVariants: Using source resolution ({source_w}x{source_h}) for {res_key} variant of {base_map_type} as target matches source.")
else: # Downscale (source_max_dim > res_max_dim)
# Downscale, maintaining aspect ratio
aspect_ratio = source_w / source_h
@@ -166,14 +147,14 @@ def save_image_variants(
else:
target_h_res = res_max_dim
target_w_res = max(1, int(res_max_dim * aspect_ratio)) # Ensure width is at least 1
logger.info(f"SaveImageVariants: Calculated downscale for {final_internal_map_type} {res_key}: from ({source_w}x{source_h}) to ({target_w_res}x{target_h_res})")
logger.info(f"SaveImageVariants: Calculated downscale for {base_map_type} {res_key}: from ({source_w}x{source_h}) to ({target_w_res}x{target_h_res})")


# Resize source_image_data (only if necessary)
if (target_w_res, target_h_res) == (source_w, source_h):
# No resize needed if dimensions match
variant_data = source_image_data.copy() # Copy to avoid modifying original if needed later
logger.debug(f"SaveImageVariants: No resize needed for {final_internal_map_type} {res_key}, using copy of source data.")
logger.debug(f"SaveImageVariants: No resize needed for {base_map_type} {res_key}, using copy of source data.")
else:
# Perform resize only if dimensions differ (i.e., downscaling)
interpolation_method = cv2.INTER_AREA # Good for downscaling
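The upscale-prevention and aspect-preserving downscale logic above can be condensed into one pure function (an illustrative sketch; the hunk only shows the height-dominant branch, so the width-dominant branch here is inferred from the aspect-ratio arithmetic and `variant_dimensions` is our name):

```python
from typing import Optional, Tuple


def variant_dimensions(source_w: int, source_h: int,
                       res_max_dim: int) -> Optional[Tuple[int, int]]:
    """Target (w, h) for a resolution variant, as in save_image_variants.

    Returns None when the variant would require upscaling; otherwise
    aspect-preserving dimensions whose larger side equals res_max_dim.
    """
    source_max_dim = max(source_w, source_h)
    if res_max_dim > source_max_dim:
        return None  # variant is skipped: never upscale
    if res_max_dim == source_max_dim:
        return source_w, source_h  # exact match: keep source dimensions
    aspect_ratio = source_w / source_h
    if source_w >= source_h:
        # Width-dominant branch (inferred): width pins to res_max_dim.
        return res_max_dim, max(1, int(res_max_dim / aspect_ratio))
    # Height-dominant branch, as shown in the diff.
    return max(1, int(res_max_dim * aspect_ratio)), res_max_dim
```

The `max(1, ...)` guard matches the diff's comment about keeping the minor dimension at least 1 pixel for extreme aspect ratios.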
@@ -181,22 +162,21 @@ def save_image_variants(
variant_data = ipu.resize_image(source_image_data, target_w_res, target_h_res, interpolation=interpolation_method)
if variant_data is None: # Check if resize failed
raise ValueError("ipu.resize_image returned None")
logger.debug(f"SaveImageVariants: Resized variant data shape for {final_internal_map_type} {res_key}: {variant_data.shape}")
logger.debug(f"SaveImageVariants: Resized variant data shape for {base_map_type} {res_key}: {variant_data.shape}")
except Exception as e:
logger.error(f"SaveImageVariants: Error resizing image for {final_internal_map_type} {res_key} variant: {e}")
logger.error(f"SaveImageVariants: Error resizing image for {base_map_type} {res_key} variant: {e}")
continue # Skip this variant if resizing fails

# Filename Construction
current_tokens = output_filename_pattern_tokens.copy()
# Use the filename-friendly version for the filename token
current_tokens['maptype'] = get_filename_friendly_map_type(final_internal_map_type, file_type_defs)
current_tokens['maptype'] = base_map_type
current_tokens['resolution'] = res_key

# Determine final extension for this variant, considering JPG threshold
final_variant_ext = current_output_ext

# --- Start JPG Threshold Logging ---
logger.debug(f"SaveImageVariants: JPG Threshold Check for {final_internal_map_type} {res_key}:")
logger.debug(f"SaveImageVariants: JPG Threshold Check for {base_map_type} {res_key}:")
logger.debug(f" - target_bit_depth: {target_bit_depth}")
logger.debug(f" - resolution_threshold_for_jpg: {resolution_threshold_for_jpg}")
logger.debug(f" - target_w_res: {target_w_res}, target_h_res: {target_h_res}")
@@ -218,7 +198,7 @@ def save_image_variants(

if cond_bit_depth and cond_threshold_not_none and cond_res_exceeded and cond_is_png:
final_variant_ext = 'jpg'
logger.info(f"SaveImageVariants: Overriding 8-bit PNG to JPG for {final_internal_map_type} {res_key} due to resolution {max(target_w_res, target_h_res)}px > threshold {resolution_threshold_for_jpg}px.")
logger.info(f"SaveImageVariants: Overriding 8-bit PNG to JPG for {base_map_type} {res_key} due to resolution {max(target_w_res, target_h_res)}px > threshold {resolution_threshold_for_jpg}px.")

current_tokens['ext'] = final_variant_ext

@@ -236,14 +216,14 @@ def save_image_variants(
continue # Skip this variant

output_path = output_base_directory / filename
logger.info(f"SaveImageVariants: Constructed output path for {final_internal_map_type} {res_key}: {output_path}")
logger.info(f"SaveImageVariants: Constructed output path for {base_map_type} {res_key}: {output_path}")

# Ensure parent directory exists
output_path.parent.mkdir(parents=True, exist_ok=True)
logger.debug(f"SaveImageVariants: Ensured directory exists for {final_internal_map_type} {res_key}: {output_path.parent}")
logger.debug(f"SaveImageVariants: Ensured directory exists for {base_map_type} {res_key}: {output_path.parent}")

except Exception as e:
logger.error(f"SaveImageVariants: Error constructing filepath for {final_internal_map_type} {res_key} variant: {e}")
logger.error(f"SaveImageVariants: Error constructing filepath for {base_map_type} {res_key} variant: {e}")
continue # Skip this variant if path construction fails


@@ -252,11 +232,11 @@ def save_image_variants(
if final_variant_ext == 'jpg': # Check against final_variant_ext
save_params_cv2.append(cv2.IMWRITE_JPEG_QUALITY)
save_params_cv2.append(jpg_quality)
logger.debug(f"SaveImageVariants: Using JPG quality: {jpg_quality} for {final_internal_map_type} {res_key}")
logger.debug(f"SaveImageVariants: Using JPG quality: {jpg_quality} for {base_map_type} {res_key}")
elif final_variant_ext == 'png': # Check against final_variant_ext
save_params_cv2.append(cv2.IMWRITE_PNG_COMPRESSION)
save_params_cv2.append(png_compression_level)
logger.debug(f"SaveImageVariants: Using PNG compression level: {png_compression_level} for {final_internal_map_type} {res_key}")
logger.debug(f"SaveImageVariants: Using PNG compression level: {png_compression_level} for {base_map_type} {res_key}")
# Add other format specific parameters if needed (e.g., TIFF compression)


@@ -277,8 +257,7 @@ def save_image_variants(
# Saving
try:
# ipu.save_image is expected to handle the actual cv2.imwrite call
logger.debug(f"SaveImageVariants: Preparing to save {final_internal_map_type} {res_key}. Data dtype: {image_data_for_save.dtype}, shape: {image_data_for_save.shape}. Target dtype for ipu.save_image: {output_dtype_for_save}")
logger.debug(f"SaveImageVariants: Attempting to save {final_internal_map_type} {res_key} to {output_path} with params {save_params_cv2}, target_dtype: {output_dtype_for_save}")
logger.debug(f"SaveImageVariants: Attempting to save {base_map_type} {res_key} to {output_path} with params {save_params_cv2}, target_dtype: {output_dtype_for_save}")
success = ipu.save_image(
str(output_path),
image_data_for_save,
@@ -286,7 +265,7 @@ def save_image_variants(
params=save_params_cv2
)
if success:
logger.info(f"SaveImageVariants: Successfully saved {final_internal_map_type} {res_key} variant to {output_path}")
logger.info(f"SaveImageVariants: Successfully saved {base_map_type} {res_key} variant to {output_path}")
# Collect details for the returned list
saved_file_details.append({
'path': str(output_path),
@@ -296,10 +275,10 @@ def save_image_variants(
'dimensions': (target_w_res, target_h_res)
})
else:
logger.error(f"SaveImageVariants: Failed to save {final_internal_map_type} {res_key} variant to {output_path} (ipu.save_image returned False)")
logger.error(f"SaveImageVariants: Failed to save {base_map_type} {res_key} variant to {output_path} (ipu.save_image returned False)")

except Exception as e:
logger.error(f"SaveImageVariants: Error during ipu.save_image for {final_internal_map_type} {res_key} variant to {output_path}: {e}", exc_info=True)
logger.error(f"SaveImageVariants: Error during ipu.save_image for {base_map_type} {res_key} variant to {output_path}: {e}", exc_info=True)
# Continue to next variant even if one fails


@@ -309,7 +288,7 @@ def save_image_variants(


# 5. Return List of Saved File Details
logger.info(f"Finished saving variants for map type: {final_internal_map_type}. Saved {len(saved_file_details)} variants.")
logger.info(f"Finished saving variants for map type: {base_map_type}. Saved {len(saved_file_details)} variants.")
return saved_file_details

# Optional Helper Functions (can be added here if needed)

@@ -12,14 +12,12 @@ class FileRule:
channel_merge_instructions: Dict[str, Any] = dataclasses.field(default_factory=dict)
output_format_override: str = None
processing_items: List['ProcessingItem'] = dataclasses.field(default_factory=list) # Added field
parent_asset: 'AssetRule' = None # Added parent back-reference

def to_json(self) -> str:
# Exclude parent_asset to avoid circular references
data = dataclasses.asdict(self)
if 'parent_asset' in data:
del data['parent_asset']
return json.dumps(data, indent=4)
# Need to handle ProcessingItem serialization if it contains non-serializable types like np.ndarray
# For now, assume asdict handles it or it's handled before calling to_json for persistence.
# A custom asdict_factory might be needed for robust serialization.
return json.dumps(dataclasses.asdict(self), indent=4)

@classmethod
def from_json(cls, json_string: str) -> 'FileRule':
@@ -33,14 +31,9 @@ class AssetRule:
asset_type_override: str = None
common_metadata: Dict[str, Any] = dataclasses.field(default_factory=dict)
files: List[FileRule] = dataclasses.field(default_factory=list)
parent_source: 'SourceRule' = None # Added parent back-reference

def to_json(self) -> str:
# Exclude parent_source to avoid circular references
data = dataclasses.asdict(self)
if 'parent_source' in data:
del data['parent_source']
return json.dumps(data, indent=4)
return json.dumps(dataclasses.asdict(self), indent=4)

@classmethod
def from_json(cls, json_string: str) -> 'AssetRule':

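The circular-reference problem that the `parent_asset` / `parent_source` exclusions address can be reproduced with stripped-down dataclasses (illustrative field names, not the full repo classes). One caveat worth noting: `dataclasses.asdict(self)` recurses into nested dataclasses *before* any key can be deleted, so with a child-to-parent back-reference it can hit a `RecursionError`; excluding the field up front, as sketched here, avoids that:

```python
import dataclasses
import json
from typing import List, Optional


@dataclasses.dataclass
class FileRule:
    filename: str = ""
    parent_asset: Optional["AssetRule"] = None  # back-reference, never serialized

    def to_json(self) -> str:
        # Build the dict field-by-field, skipping the back-reference, so the
        # serializer never walks the FileRule -> AssetRule -> FileRule cycle.
        data = {f.name: getattr(self, f.name)
                for f in dataclasses.fields(self) if f.name != "parent_asset"}
        return json.dumps(data, indent=4)


@dataclasses.dataclass
class AssetRule:
    asset_name: str = ""
    files: List[FileRule] = dataclasses.field(default_factory=list)
```

With `fr.parent_asset = ar` and `ar.files = [fr]` forming a cycle, `fr.to_json()` still produces plain JSON containing only the data fields.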
@@ -1 +0,0 @@
Asset Processor first-time setup complete.
@@ -1,280 +0,0 @@
{
    "preset_name": "Dinesen",
    "supplier_name": "Dinesen",
    "notes": "Preset for standard Poliigon downloads. Prioritizes _xxx16 files. Moves previews etc. to Extra/. Assumes Metal/Rough workflow.",
    "source_naming": {
        "separator": "_",
        "part_indices": {
            "base_name": 0,
            "map_type": 1
        },
        "glossiness_keywords": [
            "GLOSS"
        ]
    },
    "move_to_extra_patterns": [
        "*_Preview*",
        "*_Sphere*",
        "*_Cube*",
        "*_Flat*",
        "*.txt",
        "*.pdf",
        "*.url",
        "*.htm*",
        "*_Fabric.*",
        "*_DISP_*METALNESS*"
    ],
    "map_type_mapping": [
        {
            "target_type": "MAP_COL",
            "keywords": [
                "COLOR*",
                "COL",
                "COL-*",
                "DIFFUSE",
                "DIF",
                "ALBEDO"
            ]
        },
        {
            "target_type": "MAP_NRM",
            "keywords": [
                "NORMAL*",
                "NORM*",
                "NRM*",
                "N"
            ],
            "priority_keywords": [
                "*_NRM16*",
                "*_NM16*",
                "*Normal16*"
            ]
        },
        {
            "target_type": "MAP_ROUGH",
            "keywords": [
                "ROUGHNESS",
                "ROUGH"
            ]
        },
        {
            "target_type": "MAP_GLOSS",
            "keywords": [
                "GLOSS"
            ]
        },
        {
            "target_type": "MAP_AO",
            "keywords": [
                "AMBIENTOCCLUSION",
                "AO"
            ]
        },
        {
            "target_type": "MAP_DISP",
            "keywords": [
                "DISPLACEMENT",
                "DISP",
                "HEIGHT",
                "BUMP"
            ],
            "priority_keywords": [
                "*_DISP16*",
                "*_DSP16*",
                "*DSP16*",
                "*DISP16*",
                "*Displacement16*",
                "*Height16*"
            ]
        },
        {
            "target_type": "MAP_REFL",
            "keywords": [
                "REFLECTION",
                "REFL",
                "SPECULAR",
                "SPEC"
            ]
        },
        {
            "target_type": "MAP_SSS",
            "keywords": [
                "SSS",
                "SUBSURFACE*"
            ]
        },
        {
            "target_type": "MAP_FUZZ",
            "keywords": [
                "FUZZ"
            ]
        },
        {
            "target_type": "MAP_IDMAP",
            "keywords": [
                "IDMAP"
            ]
        },
        {
            "target_type": "MAP_MASK",
            "keywords": [
                "OPAC*",
                "TRANSP*",
                "MASK*",
                "ALPHA*"
            ]
        },
        {
            "target_type": "MAP_METAL",
            "keywords": [
                "METAL*",
                "METALLIC"
            ]
        }
    ],
    "asset_category_rules": {
        "model_patterns": [
            "*.fbx",
            "*.obj",
            "*.blend",
            "*.mtl"
        ],
        "decal_keywords": [
            "Decal"
        ]
    },
    "archetype_rules": [
        [
            "Foliage",
            {
                "match_any": [
                    "Plant",
                    "Leaf",
                    "Leaves",
                    "Grass"
                ],
                "match_all": []
            }
        ],
        [
            "Fabric",
            {
                "match_any": [
                    "Fabric",
                    "Carpet",
                    "Cloth",
                    "Textile",
                    "Leather"
                ],
                "match_all": []
            }
        ],
        [
            "Wood",
            {
                "match_any": [
                    "Wood",
                    "Timber",
                    "Plank",
                    "Board"
                ],
                "match_all": []
            }
        ],
        [
            "Metal",
            {
                "match_any": [
                    "_Metal",
                    "Steel",
                    "Iron",
                    "Gold",
                    "Copper",
                    "Chrome",
                    "Aluminum",
                    "Brass",
                    "Bronze"
                ],
                "match_all": []
            }
        ],
        [
            "Concrete",
            {
                "match_any": [
                    "Concrete",
                    "Cement"
                ],
                "match_all": []
            }
        ],
        [
            "Ground",
            {
                "match_any": [
                    "Ground",
                    "Dirt",
                    "Soil",
                    "Mud",
                    "Sand",
                    "Gravel",
                    "Asphalt",
                    "Road",
                    "Moss"
                ],
                "match_all": []
            }
        ],
        [
            "Stone",
            {
                "match_any": [
                    "Stone",
                    "Rock*",
                    "Marble",
                    "Granite",
                    "Brick",
                    "Tile",
                    "Paving",
                    "Pebble*",
                    "Terrazzo",
                    "Slate"
                ],
                "match_all": []
            }
        ],
        [
            "Plaster",
            {
                "match_any": [
                    "Plaster",
                    "Stucco",
                    "Wall",
                    "Paint"
                ],
                "match_all": []
            }
        ],
        [
            "Plastic",
            {
                "match_any": [
                    "Plastic",
                    "PVC",
                    "Resin",
                    "Rubber"
                ],
                "match_all": []
            }
        ],
        [
            "Glass",
            {
                "match_any": [
                    "Glass"
                ],
                "match_all": []
            }
        ]
    ]
}
@@ -1,280 +0,0 @@
{
    "preset_name": "Poliigon Standard v2",
    "supplier_name": "Poliigon",
    "notes": "Preset for standard Poliigon downloads. Prioritizes _xxx16 files. Moves previews etc. to Extra/. Assumes Metal/Rough workflow.",
    "source_naming": {
        "separator": "_",
        "part_indices": {
            "base_name": 0,
            "map_type": 1
        },
        "glossiness_keywords": [
            "GLOSS"
        ]
    },
    "move_to_extra_patterns": [
        "*_Preview*",
        "*_Sphere*",
        "*_Cube*",
        "*_Flat*",
        "*.txt",
        "*.pdf",
        "*.url",
        "*.htm*",
        "*_Fabric.*",
        "*_Albedo*"
    ],
    "map_type_mapping": [
        {
            "target_type": "MAP_COL",
            "keywords": [
                "COLOR*",
                "COL",
                "COL-*",
                "DIFFUSE",
                "DIF",
                "ALBEDO"
            ]
        },
        {
            "target_type": "MAP_NRM",
            "keywords": [
                "NORMAL*",
                "NORM*",
                "NRM*",
                "N"
            ],
            "priority_keywords": [
                "*_NRM16*",
                "*_NM16*",
                "*Normal16*"
            ]
        },
        {
            "target_type": "MAP_ROUGH",
            "keywords": [
                "ROUGHNESS",
                "ROUGH"
            ]
        },
        {
            "target_type": "MAP_GLOSS",
            "keywords": [
                "GLOSS"
            ]
        },
        {
            "target_type": "MAP_AO",
            "keywords": [
                "AMBIENTOCCLUSION",
                "AO"
            ]
        },
        {
            "target_type": "MAP_DISP",
            "keywords": [
                "DISPLACEMENT",
                "DISP",
                "HEIGHT",
                "BUMP"
            ],
            "priority_keywords": [
                "*_DISP16*",
                "*_DSP16*",
                "*DSP16*",
                "*DISP16*",
                "*Displacement16*",
                "*Height16*"
            ]
        },
        {
            "target_type": "MAP_REFL",
            "keywords": [
                "REFLECTION",
                "REFL",
                "SPECULAR",
                "SPEC"
            ]
        },
        {
            "target_type": "MAP_SSS",
            "keywords": [
                "SSS",
                "SUBSURFACE*"
            ]
        },
        {
            "target_type": "MAP_FUZZ",
            "keywords": [
                "FUZZ"
            ]
        },
        {
            "target_type": "MAP_IDMAP",
            "keywords": [
                "IDMAP"
            ]
        },
        {
"target_type": "MAP_MASK",
|
||||
"keywords": [
|
||||
"OPAC*",
|
||||
"TRANSP*",
|
||||
"MASK*",
|
||||
"ALPHA*"
|
||||
]
|
||||
},
|
||||
{
|
||||
"target_type": "MAP_METAL",
|
||||
"keywords": [
|
||||
"METAL*",
|
||||
"METALLIC"
|
||||
]
|
||||
}
|
||||
],
|
||||
"asset_category_rules": {
|
||||
"model_patterns": [
|
||||
"*.fbx",
|
||||
"*.obj",
|
||||
"*.blend",
|
||||
"*.mtl"
|
||||
],
|
||||
"decal_keywords": [
|
||||
"Decal"
|
||||
]
|
||||
},
|
||||
"archetype_rules": [
|
||||
[
|
||||
"Foliage",
|
||||
{
|
||||
"match_any": [
|
||||
"Plant",
|
||||
"Leaf",
|
||||
"Leaves",
|
||||
"Grass"
|
||||
],
|
||||
"match_all": []
|
||||
}
|
||||
],
|
||||
[
|
||||
"Fabric",
|
||||
{
|
||||
"match_any": [
|
||||
"Fabric",
|
||||
"Carpet",
|
||||
"Cloth",
|
||||
"Textile",
|
||||
"Leather"
|
||||
],
|
||||
"match_all": []
|
||||
}
|
||||
],
|
||||
[
|
||||
"Wood",
|
||||
{
|
||||
"match_any": [
|
||||
"Wood",
|
||||
"Timber",
|
||||
"Plank",
|
||||
"Board"
|
||||
],
|
||||
"match_all": []
|
||||
}
|
||||
],
|
||||
[
|
||||
"Metal",
|
||||
{
|
||||
"match_any": [
|
||||
"_Metal",
|
||||
"Steel",
|
||||
"Iron",
|
||||
"Gold",
|
||||
"Copper",
|
||||
"Chrome",
|
||||
"Aluminum",
|
||||
"Brass",
|
||||
"Bronze"
|
||||
],
|
||||
"match_all": []
|
||||
}
|
||||
],
|
||||
[
|
||||
"Concrete",
|
||||
{
|
||||
"match_any": [
|
||||
"Concrete",
|
||||
"Cement"
|
||||
],
|
||||
"match_all": []
|
||||
}
|
||||
],
|
||||
[
|
||||
"Ground",
|
||||
{
|
||||
"match_any": [
|
||||
"Ground",
|
||||
"Dirt",
|
||||
"Soil",
|
||||
"Mud",
|
||||
"Sand",
|
||||
"Gravel",
|
||||
"Asphalt",
|
||||
"Road",
|
||||
"Moss"
|
||||
],
|
||||
"match_all": []
|
||||
}
|
||||
],
|
||||
[
|
||||
"Stone",
|
||||
{
|
||||
"match_any": [
|
||||
"Stone",
|
||||
"Rock*",
|
||||
"Marble",
|
||||
"Granite",
|
||||
"Brick",
|
||||
"Tile",
|
||||
"Paving",
|
||||
"Pebble*",
|
||||
"Terrazzo",
|
||||
"Slate"
|
||||
],
|
||||
"match_all": []
|
||||
}
|
||||
],
|
||||
[
|
||||
"Plaster",
|
||||
{
|
||||
"match_any": [
|
||||
"Plaster",
|
||||
"Stucco",
|
||||
"Wall",
|
||||
"Paint"
|
||||
],
|
||||
"match_all": []
|
||||
}
|
||||
],
|
||||
[
|
||||
"Plastic",
|
||||
{
|
||||
"match_any": [
|
||||
"Plastic",
|
||||
"PVC",
|
||||
"Resin",
|
||||
"Rubber"
|
||||
],
|
||||
"match_all": []
|
||||
}
|
||||
],
|
||||
[
|
||||
"Glass",
|
||||
{
|
||||
"match_any": [
|
||||
"Glass"
|
||||
],
|
||||
"match_all": []
|
||||
}
|
||||
]
|
||||
]
|
||||
}
|
@@ -1,270 +0,0 @@
{
  "preset_name": "Poliigon Standard v2",
  "supplier_name": "Poliigon",
  "notes": "Preset for standard Poliigon downloads. Prioritizes _xxx16 files. Moves previews etc. to Extra/. Assumes Metal/Rough workflow.",
  "source_naming": {
    "separator": "_",
    "part_indices": { "base_name": 0, "map_type": 1 },
    "glossiness_keywords": ["GLOSS"],
    "bit_depth_variants": { "NRM": "*_NRM16*", "DISP": "*_DISP16*" }
  },
  "move_to_extra_patterns": ["*_Preview*", "*_Sphere*", "*_Cube*", "*_Flat*", "*.txt", "*.pdf", "*.url", "*.htm*", "*_Fabric.*"],
  "map_type_mapping": [
    { "target_type": "MAP_COL", "keywords": ["COLOR*", "COL", "DIFFUSE", "DIF", "ALBEDO"] },
    { "target_type": "MAP_NRM", "keywords": ["NORMAL*", "NORM*", "NRM*", "N"] },
    { "target_type": "MAP_ROUGH", "keywords": ["ROUGHNESS", "ROUGH"] },
    { "target_type": "MAP_ROUGH", "keywords": ["GLOSS"] },
    { "target_type": "MAP_AO", "keywords": ["AMBIENTOCCLUSION", "AO"] },
    { "target_type": "MAP_DISP", "keywords": ["DISPLACEMENT", "DISP", "HEIGHT", "BUMP"] },
    { "target_type": "MAP_REFL", "keywords": ["REFLECTION", "REFL", "SPECULAR", "SPEC"] },
    { "target_type": "MAP_SSS", "keywords": ["SSS", "SUBSURFACE*"] },
    { "target_type": "MAP_FUZZ", "keywords": ["FUZZ"] },
    { "target_type": "MAP_IDMAP", "keywords": ["ID*", "IDMAP"] },
    { "target_type": "MAP_MASK", "keywords": ["OPAC*", "TRANS*", "MASK*", "ALPHA*"] },
    { "target_type": "MAP_METAL", "keywords": ["METALNESS_", "METALLIC"] }
  ],
  "asset_category_rules": {
    "model_patterns": ["*.fbx", "*.obj", "*.blend", "*.mtl"],
    "decal_keywords": ["Decal"]
  },
  "archetype_rules": [
    ["Foliage", { "match_any": ["Plant", "Leaf", "Leaves", "Grass"], "match_all": [] }],
    ["Fabric", { "match_any": ["Fabric", "Carpet", "Cloth", "Textile", "Leather"], "match_all": [] }],
    ["Wood", { "match_any": ["Wood", "Timber", "Plank", "Board"], "match_all": [] }],
    ["Metal", { "match_any": ["_Metal", "Steel", "Iron", "Gold", "Copper", "Chrome", "Aluminum", "Brass", "Bronze"], "match_all": [] }],
    ["Concrete", { "match_any": ["Concrete", "Cement"], "match_all": [] }],
    ["Ground", { "match_any": ["Ground", "Dirt", "Soil", "Mud", "Sand", "Gravel", "Asphalt", "Road", "Moss"], "match_all": [] }],
    ["Stone", { "match_any": ["Stone", "Rock*", "Marble", "Granite", "Brick", "Tile", "Paving", "Pebble*", "Terrazzo", "Slate"], "match_all": [] }],
    ["Plaster", { "match_any": ["Plaster", "Stucco", "Wall", "Paint"], "match_all": [] }],
    ["Plastic", { "match_any": ["Plastic", "PVC", "Resin", "Rubber"], "match_all": [] }],
    ["Glass", { "match_any": ["Glass"], "match_all": [] }]
  ]
}
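The presets above pair each `target_type` with glob-style keywords, split out of the filename using `source_naming.separator` and `part_indices.map_type`. As a hedged sketch of how such a rule table could be applied — the helper name and exact matching semantics are assumptions for illustration, not taken from this repository:

```python
from fnmatch import fnmatch

# Illustrative subset of a preset's "map_type_mapping" section.
MAP_TYPE_MAPPING = [
    {"target_type": "MAP_COL", "keywords": ["COLOR*", "COL", "DIFFUSE", "DIF", "ALBEDO"]},
    {"target_type": "MAP_NRM", "keywords": ["NORMAL*", "NORM*", "NRM*", "N"]},
    {"target_type": "MAP_ROUGH", "keywords": ["ROUGHNESS", "ROUGH"]},
]


def classify_map_type(filename, separator="_", map_type_index=1):
    """Split the stem on the preset separator and match the map-type part
    against each rule's keywords; the first matching rule wins."""
    stem = filename.rsplit(".", 1)[0]
    parts = stem.split(separator)
    if len(parts) <= map_type_index:
        return None  # no map-type token to classify
    token = parts[map_type_index].upper()
    for rule in MAP_TYPE_MAPPING:
        if any(fnmatch(token, kw.upper()) for kw in rule["keywords"]):
            return rule["target_type"]
    return None


print(classify_map_type("Wood01_COL.png"))  # MAP_COL
```

A real implementation would additionally check `priority_keywords` (e.g. `*_NRM16*`) against the whole filename first, so 16-bit variants shadow their 8-bit siblings.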
@@ -1,44 +0,0 @@
{
  "ASSET_TYPE_DEFINITIONS": {
    "Surface": {
      "color": "#1f3e5d",
      "description": "A single Standard PBR material set for a surface.",
      "examples": ["Set: Wood01_COL + Wood01_NRM + WOOD01_ROUGH", "Set: Dif_Concrete + Normal_Concrete + Refl_Concrete"]
    },
    "Model": {
      "color": "#b67300",
      "description": "A set that contains models, can include PBR textureset",
      "examples": ["Single = Chair.fbx", "Set = Plant02.fbx + Plant02_col + Plant02_SSS"]
    },
    "Decal": {
      "color": "#68ac68",
      "description": "A alphamasked textureset",
      "examples": ["Set = DecalGraffiti01_Col + DecalGraffiti01_Alpha", "Single = DecalLeakStain03"]
    },
    "Atlas": {
      "color": "#955b8b",
      "description": "A texture, name usually hints that it's an atlas",
      "examples": ["Set = FoliageAtlas01_col + FoliageAtlas01_nrm"]
    },
    "UtilityMap": {
      "color": "#706b87",
      "description": "A useful image-asset consisting of only a single texture. Therefor each Utilitymap can only contain a single item.",
      "examples": ["Single = imperfection.png", "Single = smudges.png", "Single = scratches.tif"]
    }
  }
}
@@ -1,210 +0,0 @@
{
  "FILE_TYPE_DEFINITIONS": {
    "MAP_COL": { "bit_depth_rule": "force_8bit", "color": "#ffaa00", "description": "Color/Albedo Map", "examples": ["_col.", "_basecolor.", "albedo", "diffuse"], "is_grayscale": false, "keybind": "C", "standard_type": "COL" },
    "MAP_NRM": { "bit_depth_rule": "respect", "color": "#cca2f1", "description": "Normal Map", "examples": ["_nrm.", "_normal."], "is_grayscale": false, "keybind": "N", "standard_type": "NRM" },
    "MAP_METAL": { "bit_depth_rule": "force_8bit", "color": "#dcf4f2", "description": "Metalness Map", "examples": ["_metal.", "_met."], "is_grayscale": true, "keybind": "M", "standard_type": "METAL" },
    "MAP_ROUGH": { "bit_depth_rule": "force_8bit", "color": "#bfd6bf", "description": "Roughness Map", "examples": ["_rough.", "_rgh.", "_gloss"], "is_grayscale": true, "keybind": "R", "standard_type": "ROUGH" },
    "MAP_GLOSS": { "bit_depth_rule": "force_8bit", "color": "#d6bfd6", "description": "Glossiness Map", "examples": ["_gloss.", "_gls."], "is_grayscale": true, "keybind": "R", "standard_type": "GLOSS" },
    "MAP_AO": { "bit_depth_rule": "force_8bit", "color": "#e3c7c7", "description": "Ambient Occlusion Map", "examples": ["_ao.", "_ambientocclusion."], "is_grayscale": true, "keybind": "", "standard_type": "AO" },
    "MAP_DISP": { "bit_depth_rule": "respect", "color": "#c6ddd5", "description": "Displacement/Height Map", "examples": ["_disp.", "_height."], "is_grayscale": true, "keybind": "D", "standard_type": "DISP" },
    "MAP_REFL": { "bit_depth_rule": "force_8bit", "color": "#c2c2b9", "description": "Reflection/Specular Map", "examples": ["_refl.", "_specular."], "is_grayscale": true, "keybind": "M", "standard_type": "REFL" },
    "MAP_SSS": { "bit_depth_rule": "respect", "color": "#a0d394", "description": "Subsurface Scattering Map", "examples": ["_sss.", "_subsurface."], "is_grayscale": true, "keybind": "", "standard_type": "SSS" },
    "MAP_FUZZ": { "bit_depth_rule": "force_8bit", "color": "#a2d1da", "description": "Fuzz/Sheen Map", "examples": ["_fuzz.", "_sheen."], "is_grayscale": true, "keybind": "", "standard_type": "FUZZ" },
    "MAP_IDMAP": { "bit_depth_rule": "force_8bit", "color": "#ca8fb4", "description": "ID Map (for masking)", "examples": ["_id.", "_matid."], "is_grayscale": false, "keybind": "", "standard_type": "IDMAP" },
    "MAP_MASK": { "bit_depth_rule": "force_8bit", "color": "#c6e2bf", "description": "Generic Mask Map", "examples": ["_mask."], "is_grayscale": true, "keybind": "", "standard_type": "MASK" },
    "MAP_IMPERFECTION": { "bit_depth_rule": "force_8bit", "color": "#e6d1a6", "description": "Imperfection Map (scratches, dust)", "examples": ["_imp.", "_imperfection.", "splatter", "scratches", "smudges", "hairs", "fingerprints"], "is_grayscale": true, "keybind": "", "standard_type": "IMPERFECTION" },
    "MODEL": { "bit_depth_rule": "", "color": "#3db2bd", "description": "3D Model File", "examples": [".fbx", ".obj"], "is_grayscale": false, "keybind": "", "standard_type": "" },
    "EXTRA": { "bit_depth_rule": "", "color": "#8c8c8c", "description": "asset previews or metadata", "examples": [".txt", ".zip", "preview.", "_flat.", "_sphere.", "_Cube.", "thumb"], "is_grayscale": false, "keybind": "E", "standard_type": "EXTRA" },
    "FILE_IGNORE": { "bit_depth_rule": "", "color": "#673d35", "description": "File identified to be ignored due to prioritization rules (e.g., a lower bit-depth version when a higher one is present).", "category": "Ignored", "examples": ["Thumbs.db", ".DS_Store"], "is_grayscale": false, "keybind": "X", "standard_type": "", "details": {} }
  }
}
@@ -1,267 +0,0 @@
{
  "llm_predictor_examples": [
    {
      "input": "MessyTextures/Concrete_Damage_Set/concrete_col.png\nMessyTextures/Concrete_Damage_Set/concrete_N.png\nMessyTextures/Concrete_Damage_Set/concrete_rough.jpg\nMessyTextures/Concrete_Damage_Set/height_map_concrete.tif\nMessyTextures/Concrete_Damage_Set/Thumbs.db\nMessyTextures/Fabric_Pattern/pattern_01_diffuse.tga\nMessyTextures/Fabric_Pattern/pattern_01_ao.png\nMessyTextures/Fabric_Pattern/pattern_01_normal.png\nMessyTextures/Fabric_Pattern/notes.txt\nMessyTextures/Fabric_Pattern/variant_blue_diffuse.tga\nMessyTextures/Fabric_Pattern/fabric_flat.jpg",
      "output": {
        "individual_file_analysis": [
          { "relative_file_path": "MessyTextures/Concrete_Damage_Set/concrete_col.png", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "Concrete_Damage_Set" },
          { "relative_file_path": "MessyTextures/Concrete_Damage_Set/concrete_N.png", "classified_file_type": "MAP_NRM", "proposed_asset_group_name": "Concrete_Damage_Set" },
          { "relative_file_path": "MessyTextures/Concrete_Damage_Set/concrete_rough.jpg", "classified_file_type": "MAP_ROUGH", "proposed_asset_group_name": "Concrete_Damage_Set" },
          { "relative_file_path": "MessyTextures/Concrete_Damage_Set/height_map_concrete.tif", "classified_file_type": "MAP_DISP", "proposed_asset_group_name": "Concrete_Damage_Set" },
          { "relative_file_path": "MessyTextures/Concrete_Damage_Set/Thumbs.db", "classified_file_type": "FILE_IGNORE", "proposed_asset_group_name": null },
          { "relative_file_path": "MessyTextures/Fabric_Pattern/pattern_01_diffuse.tga", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "Fabric_Pattern_01" },
          { "relative_file_path": "MessyTextures/Fabric_Pattern/pattern_01_ao.png", "classified_file_type": "MAP_AO", "proposed_asset_group_name": "Fabric_Pattern_01" },
          { "relative_file_path": "MessyTextures/Fabric_Pattern/pattern_01_normal.png", "classified_file_type": "MAP_NRM", "proposed_asset_group_name": "Fabric_Pattern_01" },
          { "relative_file_path": "MessyTextures/Fabric_Pattern/notes.txt", "classified_file_type": "EXTRA", "proposed_asset_group_name": "Fabric_Pattern_01" },
          { "relative_file_path": "MessyTextures/Fabric_Pattern/variant_blue_diffuse.tga", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "Fabric_Pattern_01" },
          { "relative_file_path": "MessyTextures/Fabric_Pattern/fabric_flat.jpg", "classified_file_type": "EXTRA", "proposed_asset_group_name": "Fabric_Pattern_01" }
        ],
        "asset_group_classifications": { "Concrete_Damage_Set": "Surface", "Fabric_Pattern_01": "Surface" }
      }
    },
    {
      "input": "SciFi_Drone/Drone_Model.fbx\nSciFi_Drone/Textures/Drone_BaseColor.png\nSciFi_Drone/Textures/Drone_Metallic.png\nSciFi_Drone/Textures/Drone_Roughness.png\nSciFi_Drone/Textures/Drone_Normal.png\nSciFi_Drone/Textures/Drone_Emissive.jpg\nSciFi_Drone/ReferenceImages/concept.jpg",
      "output": {
        "individual_file_analysis": [
          { "relative_file_path": "SciFi_Drone/Drone_Model.fbx", "classified_file_type": "MODEL", "proposed_asset_group_name": "SciFi_Drone" },
          { "relative_file_path": "SciFi_Drone/Textures/Drone_BaseColor.png", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "SciFi_Drone" },
          { "relative_file_path": "SciFi_Drone/Textures/Drone_Metallic.png", "classified_file_type": "MAP_METAL", "proposed_asset_group_name": "SciFi_Drone" },
          { "relative_file_path": "SciFi_Drone/Textures/Drone_Roughness.png", "classified_file_type": "MAP_ROUGH", "proposed_asset_group_name": "SciFi_Drone" },
          { "relative_file_path": "SciFi_Drone/Textures/Drone_Normal.png", "classified_file_type": "MAP_NRM", "proposed_asset_group_name": "SciFi_Drone" },
          { "relative_file_path": "SciFi_Drone/Textures/Drone_Emissive.jpg", "classified_file_type": "EXTRA", "proposed_asset_group_name": "SciFi_Drone" },
          { "relative_file_path": "SciFi_Drone/ReferenceImages/concept.jpg", "classified_file_type": "EXTRA", "proposed_asset_group_name": "SciFi_Drone" }
        ],
        "asset_group_classifications": { "SciFi_Drone": "Model" }
      }
    },
    {
      "input": "21_hairs_deposits.tif\n22_hairs_fabric.tif\n23_hairs_fibres.tif\n24_hairs_fibres.tif\n25_bonus_isolatedFingerprints.tif\n26_bonus_isolatedPalmprint.tif\n27_metal_aluminum.tif\n28_metal_castIron.tif\n29_scratcehes_deposits_shapes.tif\n30_scratches_deposits.tif",
      "output": {
        "individual_file_analysis": [
          { "relative_file_path": "21_hairs_deposits.tif", "classified_file_type": "MAP_IMPERFECTION", "proposed_asset_group_name": "Hairs_Deposits_21" },
          { "relative_file_path": "22_hairs_fabric.tif", "classified_file_type": "MAP_IMPERFECTION", "proposed_asset_group_name": "Hairs_Fabric_22" },
          { "relative_file_path": "23_hairs_fibres.tif", "classified_file_type": "MAP_IMPERFECTION", "proposed_asset_group_name": "Hairs_Fibres_23" },
          { "relative_file_path": "24_hairs_fibres.tif", "classified_file_type": "MAP_IMPERFECTION", "proposed_asset_group_name": "Hairs_Fibres_24" },
          { "relative_file_path": "25_bonus_isolatedFingerprints.tif", "classified_file_type": "MAP_IMPERFECTION", "proposed_asset_group_name": "Bonus_IsolatedFingerprints_25" },
          { "relative_file_path": "26_bonus_isolatedPalmprint.tif", "classified_file_type": "MAP_IMPERFECTION", "proposed_asset_group_name": "Bonus_IsolatedPalmprint_26" },
          { "relative_file_path": "27_metal_aluminum.tif", "classified_file_type": "MAP_IMPERFECTION", "proposed_asset_group_name": "Metal_Aluminum_27" },
          { "relative_file_path": "28_metal_castIron.tif", "classified_file_type": "MAP_IMPERFECTION", "proposed_asset_group_name": "Metal_CastIron_28" },
          { "relative_file_path": "29_scratcehes_deposits_shapes.tif", "classified_file_type": "MAP_IMPERFECTION", "proposed_asset_group_name": "Scratches_Deposits_Shapes_29" },
          { "relative_file_path": "30_scratches_deposits.tif", "classified_file_type": "MAP_IMPERFECTION", "proposed_asset_group_name": "Scratches_Deposits_30" }
        ],
        "asset_group_classifications": {
          "Hairs_Deposits_21": "UtilityMap",
          "Hairs_Fabric_22": "UtilityMap",
          "Hairs_Fibres_23": "UtilityMap",
          "Hairs_Fibres_24": "UtilityMap",
          "Bonus_IsolatedFingerprints_25": "UtilityMap",
          "Bonus_IsolatedPalmprint_26": "UtilityMap",
          "Metal_Aluminum_27": "UtilityMap",
          "Metal_CastIron_28": "UtilityMap",
          "Scratches_Deposits_Shapes_29": "UtilityMap",
          "Scratches_Deposits_30": "UtilityMap"
        }
      }
    },
    {
      "input": "Part1/TextureSupply_Boards001_A_28x300cm-Albedo.jpg\nPart1/TextureSupply_Boards001_A_28x300cm-Normal.jpg\nPart1/TextureSupply_Boards001_B_28x300cm-Albedo.jpg\nPart1/TextureSupply_Boards001_B_28x300cm-Normal.jpg\nPart1/TextureSupply_Boards001_C_28x300cm-Albedo.jpg\nPart1/TextureSupply_Boards001_C_28x300cm-Normal.jpg\nPart1/TextureSupply_Boards001_D_28x300cm-Albedo.jpg\nPart1/TextureSupply_Boards001_D_28x300cm-Normal.jpg\nPart1/TextureSupply_Boards001_E_28x300cm-Albedo.jpg\nPart1/TextureSupply_Boards001_E_28x300cm-Normal.jpg\nPart1/TextureSupply_Boards001_F_28x300cm-Albedo.jpg\nPart1/TextureSupply_Boards001_F_28x300cm-Normal.jpg",
      "output": {
        "individual_file_analysis": [
          { "relative_file_path": "Part1/TextureSupply_Boards001_A_28x300cm-Albedo.jpg", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "Boards001_A" },
          { "relative_file_path": "Part1/TextureSupply_Boards001_A_28x300cm-Normal.jpg", "classified_file_type": "MAP_NRM", "proposed_asset_group_name": "Boards001_A" },
          { "relative_file_path": "Part1/TextureSupply_Boards001_B_28x300cm-Albedo.jpg", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "Boards001_B" },
          { "relative_file_path": "Part1/TextureSupply_Boards001_B_28x300cm-Normal.jpg", "classified_file_type": "MAP_NRM", "proposed_asset_group_name": "Boards001_B" },
          { "relative_file_path": "Part1/TextureSupply_Boards001_C_28x300cm-Albedo.jpg", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "Boards001_C" },
          { "relative_file_path": "Part1/TextureSupply_Boards001_C_28x300cm-Normal.jpg", "classified_file_type": "MAP_NRM", "proposed_asset_group_name": "Boards001_C" },
          { "relative_file_path": "Part1/TextureSupply_Boards001_D_28x300cm-Albedo.jpg", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "Boards001_D" },
          { "relative_file_path": "Part1/TextureSupply_Boards001_D_28x300cm-Normal.jpg", "classified_file_type": "MAP_NRM", "proposed_asset_group_name": "Boards001_D" },
          { "relative_file_path": "Part1/TextureSupply_Boards001_E_28x300cm-Albedo.jpg", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "Boards001_E" },
          { "relative_file_path": "Part1/TextureSupply_Boards001_E_28x300cm-Normal.jpg", "classified_file_type": "MAP_NRM", "proposed_asset_group_name": "Boards001_E" },
          { "relative_file_path": "Part1/TextureSupply_Boards001_F_28x300cm-Albedo.jpg", "classified_file_type": "MAP_COL", "proposed_asset_group_name": "Boards001_F" },
          { "relative_file_path": "Part1/TextureSupply_Boards001_F_28x300cm-Normal.jpg", "classified_file_type": "MAP_NRM", "proposed_asset_group_name": "Boards001_F" }
        ],
        "asset_group_classifications": {
          "Boards001_A": "Surface",
          "Boards001_B": "Surface",
          "Boards001_C": "Surface",
          "Boards001_D": "Surface",
          "Boards001_E": "Surface",
          "Boards001_F": "Surface"
        }
      }
    }
  ],
  "asset_type_definition_format": "{KEY} = {DESCRIPTION}, examples of content of {KEY} could be: {EXAMPLES}",
  "file_type_definition_format": "{KEY} = {DESCRIPTION}, examples of keywords for {KEY} could be: {EXAMPLES}",
  "llm_endpoint_url": "http://100.65.14.122:1234/v1/chat/completions",
  "llm_api_key": "",
  "llm_model_name": "qwen2.5-coder:3b",
  "llm_temperature": 0.5,
  "llm_request_timeout": 120,
  "llm_predictor_prompt": "You are an expert asset classification system. Your task is to analyze a list of file paths, understand their relationships based on naming and directory structure, and output a structured JSON object that classifies each file individually and then classifies the logical asset groups they belong to.\\n\\nDefinitions:\\n\\nAsset Types: These define the overall category of a logical asset group. Use one of the following keys when classifying asset groups. Each definition is provided as a formatted string (e.g., 'Surface = A single PBR material set..., examples: WoodFloor01, MetalPlate05'):\\n{ASSET_TYPE_DEFINITIONS}\\n\\n\\nFile Types: These define the specific purpose of each individual file. Use one of the following keys when classifying individual files. Each definition is provided as a formatted string (e.g., 'MAP_COL = Color/Albedo Map, examples: _col., _basecolor.'):\\n{FILE_TYPE_DEFINITIONS}\\n\\n\\nCore Task & Logic:\\n\\n1. **Individual File Analysis:**\\n * Examine each `relative_file_path` in the input `FILE_LIST`.\\n * For EACH file, determine its most likely `classified_file_type` using the `FILE_TYPE_DEFINITIONS`. Pay attention to filename suffixes, keywords, and extensions. Use `FILE_IGNORE` for files like `Thumbs.db` or `.DS_Store`. Use `EXTRA` for previews, metadata, or unidentifiable maps.\\n * For EACH file, propose a logical `proposed_asset_group_name` (string). This name should represent the asset the file likely belongs to, based on common base names (e.g., `WoodFloor01` from `WoodFloor01_col.png`, `WoodFloor01_nrm.png`) or directory structure (e.g., `SciFi_Drone` for files within that folder).\\n * Files that seem to be standalone utility maps (like `scratches.png`, `FlowMap.tif`) should get a unique group name derived from their filename (e.g., `Scratches`, `FlowMap`).\\n * If a file doesn't seem to belong to any logical group (e.g., a stray readme file in the root), you can propose `null` or a generic name like `Miscellaneous`.\\n * Be consistent with the proposed names for files belonging to the same logical asset.\\n * Populate the `individual_file_analysis` array with one object for *every* file in the input list, containing `relative_file_path`, `classified_file_type`, and `proposed_asset_group_name`.\\n\\n2. **Asset Group Classification:**\\n * Collect all unique, non-null `proposed_asset_group_name` values generated in the previous step.\\n * For EACH unique group name, determine the overall `asset_type` (using `ASSET_TYPE_DEFINITIONS`) based on the types of files assigned to that group name in the `individual_file_analysis`.\\n * Example: If files proposed as `AssetGroup1` include `MAP_COL`, `MAP_NRM`, `MAP_ROUGH`, classify `AssetGroup1` as `Surface`.\\n * Example: If files proposed as `AssetGroup2` include `MODEL` and texture maps, classify `AssetGroup2` as `Model`.\\n * Example: If `AssetGroup3` only has one file classified as `MAP_IMPERFECTION`, classify `AssetGroup3` as `UtilityMap`.\\n * Populate the `asset_group_classifications` dictionary, mapping each unique `proposed_asset_group_name` to its determined `asset_type`.\\n\\nInput File List:\\n\\ntext\\n{FILE_LIST}\\n\\n\\nOutput Format:\\n\\nYour response MUST be ONLY a single JSON object. You MAY include comments (using // or /* */) within the JSON structure for clarification if needed, but the core structure must be valid JSON. Do NOT include any text, explanations, or introductory phrases before or after the JSON object itself. Ensure all strings are correctly quoted and escaped.\\n\\nCRITICAL: The output JSON structure must strictly adhere to the following format:\\n\\n```json\\n{{\\n \"individual_file_analysis\": [\\n {{\\n // Optional comment about this file\\n \"relative_file_path\": \"string\", // Exact relative path from the input list\\n \"classified_file_type\": \"string\", // Key from FILE_TYPE_DEFINITIONS\\n \"proposed_asset_group_name\": \"string_or_null\" // Your suggested group name for this file\\n }}\\n // ... one object for EVERY file in the input list\\n ],\\n \"asset_group_classifications\": {{\\n // Dictionary mapping unique proposed group names to asset types\\n \"ProposedGroupName1\": \"string\", // Key: proposed_asset_group_name, Value: Key from ASSET_TYPE_DEFINITIONS\\n \"ProposedGroupName2\": \"string\"\\n // ... one entry for each unique, non-null proposed_asset_group_name\\n }}\\n}}\\n```\\n\\nExamples:\\n\\nHere are examples of input file lists and the desired JSON output, illustrating the two-part structure:\\n\\njson\\n[\\n {EXAMPLE_INPUT_OUTPUT_PAIRS}\\n]\\n\\n\\nNow, process the provided FILE_LIST and generate ONLY the JSON output according to these instructions. Remember to include an entry in `individual_file_analysis` for every single input file path."
}
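The `asset_type_definition_format` and `file_type_definition_format` strings above are templates for expanding the definition dictionaries into the prompt's `{ASSET_TYPE_DEFINITIONS}` and `{FILE_TYPE_DEFINITIONS}` placeholders. A minimal sketch of that expansion (the function name and trimmed definitions are illustrative assumptions):

```python
# Format string copied from the config above; {KEY} appears twice on purpose.
FILE_TYPE_DEFINITION_FORMAT = (
    "{KEY} = {DESCRIPTION}, examples of keywords for {KEY} could be: {EXAMPLES}"
)

# Trimmed-down stand-in for the FILE_TYPE_DEFINITIONS config.
FILE_TYPE_DEFINITIONS = {
    "MAP_COL": {"description": "Color/Albedo Map", "examples": ["_col.", "_basecolor."]},
    "MAP_NRM": {"description": "Normal Map", "examples": ["_nrm.", "_normal."]},
}


def format_definitions(defs, fmt):
    """Render one definition line per key, ready to substitute into the prompt."""
    return "\n".join(
        fmt.format(KEY=key, DESCRIPTION=d["description"], EXAMPLES=", ".join(d["examples"]))
        for key, d in defs.items()
    )


print(format_definitions(FILE_TYPE_DEFINITIONS, FILE_TYPE_DEFINITION_FORMAT))
```

The rendered block would then be substituted for `{FILE_TYPE_DEFINITIONS}` in `llm_predictor_prompt` before the request is sent to `llm_endpoint_url`.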
@@ -1,11 +0,0 @@
{
  "Dimensiva": { "normal_map_type": "OpenGL" },
  "Dinesen": { "normal_map_type": "OpenGL" },
  "Poliigon": { "normal_map_type": "OpenGL" }
}
||||
@ -1,8 +0,0 @@
{
  "OUTPUT_BASE_DIR": "G:/02 Content/10-19 Content/13 Textures Power of Two/TestOutput",
  "OUTPUT_DIRECTORY_PATTERN": "[supplier]/[asset_category]/[asset_name]",
  "OUTPUT_FORMAT_16BIT_PRIMARY": "png",
  "OUTPUT_FORMAT_8BIT": "png",
  "RESOLUTION_THRESHOLD_FOR_JPG": 4096,
  "general_settings": {}
}
@ -1,66 +0,0 @@
import os
import sys
import platform

def get_app_data_dir():
    """
    Gets the OS-specific application data directory for Asset Processor.
    Uses standard library methods as appdirs is not available.
    """
    app_name = "AssetProcessor"
    if platform.system() == "Windows":
        # On Windows, use APPDATA environment variable
        app_data_dir = os.path.join(os.environ.get("APPDATA", "~"), app_name)
    elif platform.system() == "Darwin":
        # On macOS, use ~/Library/Application Support
        app_data_dir = os.path.join("~", "Library", "Application Support", app_name)
    else:
        # On Linux and other Unix-like systems, use ~/.config
        app_data_dir = os.path.join("~", ".config", app_name)

    # Expand the user home directory symbol if present
    return os.path.expanduser(app_data_dir)

def get_persistent_config_path_file():
    """
    Gets the full path to the file storing the user's chosen config directory.
    """
    app_data_dir = get_app_data_dir()
    # Ensure the app data directory exists
    os.makedirs(app_data_dir, exist_ok=True)
    return os.path.join(app_data_dir, "asset_processor_user_root.txt")

def read_saved_user_config_path():
    """
    Reads the saved user config path from the persistent file.
    Returns the path string or None if the file doesn't exist or is empty.
    """
    path_file = get_persistent_config_path_file()
    if os.path.exists(path_file):
        try:
            with open(path_file, "r", encoding="utf-8") as f:
                saved_path = f.read().strip()
                if saved_path:
                    return saved_path
        except IOError:
            # Handle potential file reading errors
            pass
    return None

def save_user_config_path(user_config_path):
    """
    Saves the user's chosen config path to the persistent file.
    """
    path_file = get_persistent_config_path_file()
    try:
        with open(path_file, "w", encoding="utf-8") as f:
            f.write(user_config_path)
    except IOError:
        # Handle potential file writing errors
        print(f"Error saving user config path to {path_file}", file=sys.stderr)

def get_first_run_marker_file(user_config_path):
    """
    Gets the full path to the first-run marker file within the user config directory.
    """
    return os.path.join(user_config_path, ".first_run_complete")
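Taken together, the helpers in this deleted module implement a simple first-run flow: look up the saved config directory, and if none was persisted yet, ask the user for one and save it. A self-contained sketch of that flow — here `path_file` stands in for `get_persistent_config_path_file()`, and `prompt_user_for_directory` is a hypothetical callback (e.g. a GUI folder picker), neither taken from this repo:

```python
import os
import tempfile

def resolve_user_config_path(path_file, prompt_user_for_directory):
    """Return the user's config directory, asking and persisting it on first run."""
    if os.path.exists(path_file):
        with open(path_file, "r", encoding="utf-8") as f:
            saved = f.read().strip()
        if saved:
            return saved
    # First run: ask the user, then persist the answer for next time.
    chosen = prompt_user_for_directory()
    with open(path_file, "w", encoding="utf-8") as f:
        f.write(chosen)
    return chosen

# First call prompts and saves; second call reads the saved value back.
with tempfile.TemporaryDirectory() as tmp:
    marker = os.path.join(tmp, "asset_processor_user_root.txt")
    first = resolve_user_config_path(marker, lambda: "/my/configs")
    second = resolve_user_config_path(marker, lambda: "/never/asked")
    print(first, second)  # /my/configs /my/configs
```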
@ -9,7 +9,6 @@ from typing import Optional, Dict
logger = logging.getLogger(__name__)

def generate_path_from_pattern(pattern_string: str, token_data: dict) -> str:
    logger.debug(f"generate_path_from_pattern called with pattern: '{pattern_string}', token_data keys: {list(token_data.keys())}")
    """
    Generates a file path by replacing tokens in a pattern string with values
    from the provided token_data dictionary.
@ -55,8 +54,7 @@ def generate_path_from_pattern(pattern_string: str, token_data: dict) -> str:
    # Add variations like #### for IncrementingValue
    known_tokens_lc = {
        'assettype', 'supplier', 'assetname', 'resolution', 'ext',
        'incrementingvalue', '####', 'date', 'time', 'sha5', 'applicationpath',
        'asset_category'
        'incrementingvalue', '####', 'date', 'time', 'sha5', 'applicationpath'
    }

    output_path = pattern_string
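The `known_tokens_lc` set above feeds a bracketed-token substitution against patterns like the `OUTPUT_DIRECTORY_PATTERN` value `[supplier]/[asset_category]/[asset_name]` from the deleted settings file. The full body of `generate_path_from_pattern` is not visible in this diff; the following is only a minimal sketch of the substitution step, under the assumption that tokens are bracket-delimited and matched case-insensitively (`substitute_tokens` is a hypothetical name, not the project's function):

```python
import re

def substitute_tokens(pattern_string: str, token_data: dict) -> str:
    """Replace [token] placeholders with values from token_data (case-insensitive keys)."""
    data_lc = {k.lower(): str(v) for k, v in token_data.items()}

    def repl(match: re.Match) -> str:
        token = match.group(1).lower()
        # Leave unknown tokens untouched so problems stay visible in the path.
        return data_lc.get(token, match.group(0))

    return re.sub(r"\[([^\[\]]+)\]", repl, pattern_string)

print(substitute_tokens(
    "[supplier]/[asset_category]/[asset_name]",
    {"Supplier": "Poliigon", "asset_category": "Surface", "asset_name": "WoodFloor01"},
))  # Poliigon/Surface/WoodFloor01
```

Leaving unknown tokens intact (rather than raising or substituting an empty string) makes a bad pattern obvious in the generated output path.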