Uncompleted Processing Refactor
This commit is contained in:
parent d473ddd7f4
commit 12cf557dd7
181 ProjectNotes/ProcessingEngineRefactorPlan.md (new file)
@@ -0,0 +1,181 @@
# Project Plan: Modularizing the Asset Processing Engine

**Last Updated:** May 9, 2025

**1. Project Vision & Goals**

* **Vision:** Transform the asset processing pipeline into a highly modular, extensible, and testable system.
* **Primary Goals:**
  1. Decouple processing steps into independent, reusable stages.
  2. Simplify the addition of new processing capabilities (e.g., GLOSS > ROUGH conversion, Alpha to MASK, Normal Map Green Channel inversion).
  3. Improve code maintainability and readability.
  4. Enhance unit and integration testing capabilities for each processing component.
  5. Centralize common utility functions (image manipulation, path generation).

**2. Proposed Architecture Overview**

* **Core Concept:** A `PipelineOrchestrator` will manage a sequence of `ProcessingStage`s. Each stage operates on an `AssetProcessingContext` object, which carries all necessary data and state for a single asset through the pipeline.
* **Key Components:**
  * `AssetProcessingContext`: Data class holding asset-specific data, configuration, temporary paths, and status.
  * `PipelineOrchestrator`: Manages the overall processing flow for a `SourceRule`, iterating through assets and executing the pipeline of stages for each.
  * `ProcessingStage` (base class/interface): Defines the contract for all individual processing stages (an `execute(context)` method).
  * Specific stage classes: e.g., `SupplierDeterminationStage`, `IndividualMapProcessingStage`, etc.
  * Utility modules: `image_processing_utils.py`, plus enhancements to `utils/path_utils.py`.

**3. Proposed File Structure**

* `processing/`
  * `pipeline/`
    * `__init__.py`
    * `asset_context.py` (defines `AssetProcessingContext`)
    * `orchestrator.py` (defines `PipelineOrchestrator`)
    * `stages/`
      * `__init__.py`
      * `base_stage.py` (defines the `ProcessingStage` interface)
      * `supplier_determination.py`
      * `asset_skip_logic.py`
      * `metadata_initialization.py`
      * `file_rule_filter.py`
      * `gloss_to_rough_conversion.py`
      * `alpha_extraction_to_mask.py`
      * `normal_map_green_channel.py`
      * `individual_map_processing.py`
      * `map_merging.py`
      * `metadata_finalization.py`
      * `output_organization.py`
  * `utils/`
    * `__init__.py`
    * `image_processing_utils.py` (new module for image functions)
* `utils/` (existing top-level directory)
  * `path_utils.py` (to be enhanced with `sanitize_filename` from `processing_engine.py`)

**4. Detailed Phases and Tasks**

**Phase 0: Setup & Core Structures Definition**

*Goal: Establish the foundational classes for the new pipeline.*

* **Task 0.1: Define `AssetProcessingContext`**
  * Create `processing/pipeline/asset_context.py`.
  * Define the `AssetProcessingContext` data class with fields: `source_rule: SourceRule`, `asset_rule: AssetRule`, `workspace_path: Path`, `engine_temp_dir: Path`, `output_base_path: Path`, `effective_supplier: Optional[str]`, `asset_metadata: Dict`, `processed_maps_details: Dict[str, Dict[str, Dict]]`, `merged_maps_details: Dict[str, Dict[str, Dict]]`, `files_to_process: List[FileRule]`, `loaded_data_cache: Dict`, `config_obj: Configuration`, `status_flags: Dict`, `incrementing_value: Optional[str]`, `sha5_value: Optional[str]`.
  * Ensure proper type hinting.
* **Task 0.2: Define the `ProcessingStage` Base Class/Interface**
  * Create `processing/pipeline/stages/base_stage.py`.
  * Define an abstract base class `ProcessingStage` with an abstract method `execute(self, context: AssetProcessingContext) -> AssetProcessingContext`.
* **Task 0.3: Implement the Initial `PipelineOrchestrator`**
  * Create `processing/pipeline/orchestrator.py`.
  * Define the `PipelineOrchestrator` class.
  * Implement `__init__(self, config_obj: Configuration, stages: List[ProcessingStage])`.
  * Implement `process_source_rule(self, source_rule: SourceRule, workspace_path: Path, output_base_path: Path, overwrite: bool, incrementing_value: Optional[str], sha5_value: Optional[str]) -> Dict[str, List[str]]`, which:
    * Handles creation/cleanup of the main engine temporary directory.
    * Loops through `source_rule.assets`, initializing an `AssetProcessingContext` for each.
    * Iterates `self.stages`, calling `stage.execute(context)`.
    * Collects overall status.

**Phase 1: Utility Module Refactoring**

*Goal: Consolidate and centralize common utility functions.*

* **Task 1.1: Refactor Path Utilities**
  * Move `_sanitize_filename` from `processing_engine.py` to `utils/path_utils.py`.
  * Update call sites to use the new utility function.
* **Task 1.2: Create `image_processing_utils.py`**
  * Create `processing/utils/image_processing_utils.py`.
  * Move general-purpose image functions from `processing_engine.py`:
    * `is_power_of_two`
    * `get_nearest_pot`
    * `calculate_target_dimensions`
    * `calculate_image_stats`
    * `normalize_aspect_ratio_change`
    * Core image loading, BGR<>RGB conversion, and generic resizing (from `_load_and_transform_source`).
    * Core data type conversion and color conversion for saving, plus the `cv2.imwrite` call (from `_save_image`).
  * Ensure functions are pure and testable.
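The power-of-two helpers slated for this move can be sketched ahead of time; this is a hypothetical implementation (the authoritative versions live in `processing_engine.py` and may differ, e.g. in how ties are rounded):

```python
def is_power_of_two(n: int) -> bool:
    # A positive integer is a power of two iff it has exactly one set bit.
    return n > 0 and (n & (n - 1)) == 0


def get_nearest_pot(n: int) -> int:
    # Round a dimension to the nearest power of two (ties round up here).
    if n <= 1:
        return 1
    lower = 1 << (n.bit_length() - 1)  # largest POT <= n
    upper = lower << 1                 # smallest POT > n
    return lower if (n - lower) < (upper - n) else upper
```

Keeping these pure (no I/O, no config access) is what makes the Task 4.2 unit tests trivial to write.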
**Phase 2: Implementing Core Processing Stages (Migrating Existing Logic)**

*Goal: Migrate existing functionality from `processing_engine.py` into the new stage-based architecture.*

(For each task: create the stage file, implement the class, move the logic, and adapt it to `AssetProcessingContext`.)

* **Task 2.1: Implement `SupplierDeterminationStage`**
* **Task 2.2: Implement `AssetSkipLogicStage`**
* **Task 2.3: Implement `MetadataInitializationStage`**
* **Task 2.4: Implement `FileRuleFilterStage`** (new logic for `item_type == "FILE_IGNORE"`)
* **Task 2.5: Implement `IndividualMapProcessingStage`** (adapts `_process_individual_maps`; uses `image_processing_utils.py`)
* **Task 2.6: Implement `MapMergingStage`** (adapts `_merge_maps`; uses `image_processing_utils.py`)
* **Task 2.7: Implement `MetadataFinalizationAndSaveStage`** (adapts `_generate_metadata_file`; uses `utils.path_utils.generate_path_from_pattern`)
* **Task 2.8: Implement `OutputOrganizationStage`** (adapts `_organize_output_files`)

**Phase 3: Implementing New Feature Stages**

*Goal: Add the new desired processing capabilities as distinct stages.*

* **Task 3.1: Implement `GlossToRoughConversionStage`** (identify gloss maps, convert, invert, save to temp, update the `FileRule`)
* **Task 3.2: Implement `AlphaExtractionToMaskStage`** (check for an existing mask, find a MAP_COL with alpha, extract, save to temp, add a new `FileRule`)
* **Task 3.3: Implement `NormalMapGreenChannelStage`** (identify normal maps, invert the green channel based on config, save to temp, update the `FileRule`)
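The core of Task 3.1 is a per-pixel inversion (roughness is the complement of glossiness). A minimal sketch with NumPy, assuming 8-bit or normalized-float single-channel data; the real stage additionally handles loading, temp-file naming, and `FileRule` updates:

```python
import numpy as np


def gloss_to_rough(gloss: np.ndarray) -> np.ndarray:
    # For uint8 maps the inversion is 255 - value; for float maps
    # (normalized 0..1) it is 1.0 - value.
    if gloss.dtype == np.uint8:
        return 255 - gloss
    return 1.0 - gloss
```

The same complement pattern applies to Task 3.3's green-channel flip, only restricted to `img[..., 1]`.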
**Phase 4: Integration, Testing & Finalization**

*Goal: Assemble the pipeline, test thoroughly, and deprecate the old code.*

* **Task 4.1: Configure the `PipelineOrchestrator`**
  * Instantiate `PipelineOrchestrator` in the main application logic with the ordered list of stage instances.
* **Task 4.2: Unit Testing**
  * Unit tests for each `ProcessingStage` (mocking `AssetProcessingContext`).
  * Unit tests for the `image_processing_utils.py` and `utils/path_utils.py` functions.
* **Task 4.3: Integration Testing**
  * Test the `PipelineOrchestrator` end-to-end with sample data.
  * Compare outputs with the existing engine for consistency.
* **Task 4.4: Documentation Update**
  * Update the developer documentation (e.g., `Documentation/02_Developer_Guide/05_Processing_Pipeline.md`).
  * Document `AssetProcessingContext` and each stage's responsibilities.
* **Task 4.5: Deprecate/Remove the Old `ProcessingEngine` Code**
  * Gradually remove the refactored logic from `processing_engine.py`.
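Task 4.1's assembly can be previewed as a runnable miniature of the orchestration pattern. The names here are deliberate stand-ins for illustration, not the real pipeline classes:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class MiniContext:
    # Stand-in for AssetProcessingContext: carries flags and a trace of stages run.
    status_flags: Dict = field(default_factory=dict)
    trace: List[str] = field(default_factory=list)


class MiniStage(ABC):
    @abstractmethod
    def execute(self, context: MiniContext) -> MiniContext: ...


class StageA(MiniStage):
    def execute(self, context):
        context.trace.append("supplier")
        return context


class StageB(MiniStage):
    def execute(self, context):
        context.trace.append("skip_logic")
        return context


def run_pipeline(stages: List[MiniStage], context: MiniContext) -> MiniContext:
    # The orchestrator simply threads the context through the ordered stages,
    # stopping early if a stage raises the skip flag.
    for stage in stages:
        context = stage.execute(context)
        if context.status_flags.get("skip_asset"):
            break
    return context


ctx = run_pipeline([StageA(), StageB()], MiniContext())
```

The real call site would pass the ordered list of the eleven stage instances from the diagram below into `PipelineOrchestrator`.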
**5. Workflow Diagram**

```mermaid
graph TD
    AA[Load SourceRule & Config] --> BA(PipelineOrchestrator: process_source_rule);
    BA --> CA{For Each Asset in SourceRule};
    CA -- Yes --> DA(Orchestrator: Create AssetProcessingContext);
    DA --> EA(SupplierDeterminationStage);
    EA -- context --> FA(AssetSkipLogicStage);
    FA -- context --> GA{context.skip_asset?};
    GA -- Yes --> HA(Orchestrator: Record Skipped);
    HA --> CA;
    GA -- No --> IA(MetadataInitializationStage);
    IA -- context --> JA(FileRuleFilterStage);
    JA -- context --> KA(GlossToRoughConversionStage);
    KA -- context --> LA(AlphaExtractionToMaskStage);
    LA -- context --> MA(NormalMapGreenChannelStage);
    MA -- context --> NA(IndividualMapProcessingStage);
    NA -- context --> OA(MapMergingStage);
    OA -- context --> PA(MetadataFinalizationAndSaveStage);
    PA -- context --> QA(OutputOrganizationStage);
    QA -- context --> RA(Orchestrator: Record Processed/Failed);
    RA --> CA;
    CA -- No --> SA(Orchestrator: Cleanup Engine Temp Dir);
    SA --> TA[Processing Complete];

    subgraph Stages
        direction LR
        EA
        FA
        IA
        JA
        KA
        LA
        MA
        NA
        OA
        PA
        QA
    end

    subgraph Utils
        direction LR
        U1[image_processing_utils.py]
        U2[utils/path_utils.py]
    end

    NA -.-> U1;
    OA -.-> U1;
    KA -.-> U1;
    LA -.-> U1;
    MA -.-> U1;

    PA -.-> U2;
    QA -.-> U2;

    classDef context fill:#f9f,stroke:#333,stroke-width:2px;
    class DA,EA,FA,IA,JA,KA,LA,MA,NA,OA,PA,QA context;
```
7 main.py
@@ -21,6 +21,11 @@ from PySide6.QtCore import Qt
from PySide6.QtWidgets import QApplication

# --- Backend Imports ---
# Add current directory to sys.path for direct execution
import sys
import os
sys.path.append(os.path.dirname(__file__))

try:
    from configuration import Configuration, ConfigurationError
    from processing_engine import ProcessingEngine
@@ -29,6 +34,8 @@ try:
    from utils.workspace_utils import prepare_processing_workspace
except ImportError as e:
    script_dir = Path(__file__).parent.resolve()
    print(f"ERROR: Failed to import necessary classes: {e}")
    print("Ensure configuration.py, rule_structure.py, and processing_engine.py are in the directory:")
    print(f"  {script_dir}")
24 processing/pipeline/asset_context.py (new file)
@@ -0,0 +1,24 @@
from dataclasses import dataclass
from pathlib import Path
from typing import Dict, List, Optional

from rule_structure import AssetRule, FileRule, SourceRule
from configuration import Configuration


@dataclass
class AssetProcessingContext:
    source_rule: SourceRule
    asset_rule: AssetRule
    workspace_path: Path
    engine_temp_dir: Path
    output_base_path: Path
    effective_supplier: Optional[str]
    asset_metadata: Dict
    processed_maps_details: Dict[str, Dict[str, Dict]]
    merged_maps_details: Dict[str, Dict[str, Dict]]
    files_to_process: List[FileRule]
    loaded_data_cache: Dict
    config_obj: Configuration
    status_flags: Dict
    incrementing_value: Optional[str]
    sha5_value: Optional[str]
133 processing/pipeline/orchestrator.py (new file)
@@ -0,0 +1,133 @@
from typing import List, Dict, Optional
from pathlib import Path
import shutil
import tempfile
import logging

from configuration import Configuration
from rule_structure import SourceRule, AssetRule
from .asset_context import AssetProcessingContext
from .stages.base_stage import ProcessingStage

log = logging.getLogger(__name__)


class PipelineOrchestrator:
    """
    Orchestrates the processing of assets based on source rules and a series of processing stages.
    """

    def __init__(self, config_obj: Configuration, stages: List[ProcessingStage]):
        """
        Initializes the PipelineOrchestrator.

        Args:
            config_obj: The main configuration object.
            stages: A list of processing stages to be executed in order.
        """
        self.config_obj: Configuration = config_obj
        self.stages: List[ProcessingStage] = stages

    def process_source_rule(
        self,
        source_rule: SourceRule,
        workspace_path: Path,
        output_base_path: Path,
        overwrite: bool,  # Not used in this initial implementation, but part of the signature
        incrementing_value: Optional[str],
        sha5_value: Optional[str],  # Name kept consistent with AssetProcessingContext.sha5_value
    ) -> Dict[str, List[str]]:
        """
        Processes a single source rule, iterating through its asset rules and applying all stages.

        Args:
            source_rule: The source rule to process.
            workspace_path: The base path of the workspace.
            output_base_path: The base path for output files.
            overwrite: Whether to overwrite existing files (not fully implemented yet).
            incrementing_value: An optional incrementing value for versioning or naming.
            sha5_value: An optional hash value for the asset.

        Returns:
            A dictionary summarizing the processing status of assets.
        """
        overall_status: Dict[str, List[str]] = {
            "processed": [],
            "skipped": [],
            "failed": [],
        }
        engine_temp_dir_path: Optional[Path] = None

        try:
            # Create a temporary directory for this processing run. It covers the
            # entire source_rule, not a single asset; individual stages may create
            # their own sub-temp dirs if necessary.
            temp_dir_path_str = tempfile.mkdtemp(
                prefix="asset_processor_orchestrator_temp_", dir=self.config_obj.get_temp_directory_base()
            )
            engine_temp_dir_path = Path(temp_dir_path_str)
            log.debug(f"PipelineOrchestrator created temporary directory: {engine_temp_dir_path}")

            for asset_rule in source_rule.assets:
                log.debug(f"Orchestrator: Processing asset '{asset_rule.name}'")
                context = AssetProcessingContext(
                    source_rule=source_rule,
                    asset_rule=asset_rule,
                    workspace_path=workspace_path,  # Path to the source files (e.g. extracted archive)
                    engine_temp_dir=engine_temp_dir_path,  # The orchestrator's temp dir
                    output_base_path=output_base_path,
                    effective_supplier=None,  # Set by SupplierDeterminationStage
                    asset_metadata={},  # Populated by stages
                    processed_maps_details={},  # Populated by stages
                    merged_maps_details={},  # Populated by stages
                    files_to_process=[],  # Populated by FileRuleFilterStage
                    loaded_data_cache={},  # Image loading cache within this asset's processing
                    config_obj=self.config_obj,
                    status_flags={"skip_asset": False, "asset_failed": False},  # Common flags
                    incrementing_value=incrementing_value,
                    sha5_value=sha5_value,  # Keyword must match the dataclass field name
                )

                for stage_idx, stage in enumerate(self.stages):
                    log.debug(f"Asset '{asset_rule.name}': Executing stage {stage_idx + 1}/{len(self.stages)}: {stage.__class__.__name__}")
                    try:
                        context = stage.execute(context)
                    except Exception as e:
                        log.error(f"Asset '{asset_rule.name}': Error during stage '{stage.__class__.__name__}': {e}", exc_info=True)
                        context.status_flags["asset_failed"] = True
                        context.asset_metadata["status"] = f"Failed: Error in stage {stage.__class__.__name__}"
                        context.asset_metadata["error_message"] = str(e)
                        break  # Stop processing stages for this asset on error

                    if context.status_flags.get("skip_asset"):
                        log.info(f"Asset '{asset_rule.name}': Skipped by stage '{stage.__class__.__name__}'. Reason: {context.status_flags.get('skip_reason', 'N/A')}")
                        break  # Skip remaining stages for this asset

                # Refined status collection
                if context.status_flags.get('skip_asset'):
                    overall_status["skipped"].append(asset_rule.name)
                elif context.status_flags.get('asset_failed') or str(context.asset_metadata.get('status', '')).startswith("Failed"):
                    overall_status["failed"].append(asset_rule.name)
                elif context.asset_metadata.get('status') == "Processed":
                    overall_status["processed"].append(asset_rule.name)
                else:  # Default or unknown state
                    log.warning(f"Asset '{asset_rule.name}': Unknown status after pipeline execution. Metadata status: '{context.asset_metadata.get('status')}'. Marking as failed.")
                    overall_status["failed"].append(f"{asset_rule.name} (Unknown Status: {context.asset_metadata.get('status')})")
                log.debug(f"Asset '{asset_rule.name}' final status: {context.asset_metadata.get('status', 'N/A')}, Flags: {context.status_flags}")

        except Exception as e:
            log.error(f"PipelineOrchestrator.process_source_rule failed: {e}", exc_info=True)
            # Mark all remaining assets as failed if a top-level error occurs
            processed_or_skipped_or_failed = set(overall_status["processed"] + overall_status["skipped"] + overall_status["failed"])
            for asset_rule in source_rule.assets:
                if asset_rule.name not in processed_or_skipped_or_failed:
                    overall_status["failed"].append(f"{asset_rule.name} (Orchestrator Error)")
        finally:
            if engine_temp_dir_path and engine_temp_dir_path.exists():
                log.debug(f"PipelineOrchestrator cleaning up temporary directory: {engine_temp_dir_path}")
                shutil.rmtree(engine_temp_dir_path, ignore_errors=True)

        return overall_status
175 processing/pipeline/stages/alpha_extraction_to_mask.py (new file)
@@ -0,0 +1,175 @@
import logging
import uuid
from pathlib import Path
from typing import List, Optional, Dict

import numpy as np

from .base_stage import ProcessingStage
from ..asset_context import AssetProcessingContext
from ...utils import image_processing_utils as ipu
from rule_structure import FileRule, TransformSettings
from utils.path_utils import sanitize_filename

logger = logging.getLogger(__name__)


class AlphaExtractionToMaskStage(ProcessingStage):
    """
    Extracts an alpha channel from a suitable source map (e.g., Albedo, Diffuse)
    to generate a MASK map if one is not explicitly defined.
    """
    SUITABLE_SOURCE_MAP_TYPES = ["ALBEDO", "DIFFUSE", "BASE_COLOR"]  # Map types likely to have alpha

    def execute(self, context: AssetProcessingContext) -> AssetProcessingContext:
        logger.debug(f"Asset '{context.asset_rule.name}': Running AlphaExtractionToMaskStage.")

        if context.status_flags.get('skip_asset'):
            logger.debug(f"Asset '{context.asset_rule.name}': Skipping due to 'skip_asset' flag.")
            return context

        if not context.files_to_process or not context.processed_maps_details:
            logger.debug(
                f"Asset '{context.asset_rule.name}': Skipping alpha extraction - "
                f"no files to process or no processed map details."
            )
            return context

        # A. Check for an existing MASK map
        for file_rule in context.files_to_process:
            if file_rule.map_type == "MASK":
                logger.info(
                    f"Asset '{context.asset_rule.name}': MASK map already defined by FileRule "
                    f"'{file_rule.filename_pattern}'. Skipping alpha extraction."
                )
                return context

        # B. Find a suitable source map with an alpha channel
        source_map_details_for_alpha: Optional[Dict] = None
        source_file_rule_id_for_alpha: Optional[str] = None

        for file_rule_id, details in context.processed_maps_details.items():
            if details.get('status') == 'Processed' and \
                    details.get('map_type') in self.SUITABLE_SOURCE_MAP_TYPES:
                try:
                    temp_path = Path(details['temp_processed_file'])
                    if not temp_path.exists():
                        logger.warning(
                            f"Asset '{context.asset_rule.name}': Temp file {temp_path} for map "
                            f"{details['map_type']} (ID: {file_rule_id}) does not exist. Cannot check for alpha."
                        )
                        continue

                    # Load the image to check for an alpha channel; ipu.load_image
                    # is expected to return an ndarray, or None on failure.
                    image_data = ipu.load_image(temp_path)

                    if image_data is not None and image_data.ndim == 3 and image_data.shape[2] == 4:
                        source_map_details_for_alpha = details
                        source_file_rule_id_for_alpha = file_rule_id
                        logger.info(
                            f"Asset '{context.asset_rule.name}': Found potential source for alpha extraction: "
                            f"{temp_path} (MapType: {details['map_type']})"
                        )
                        break
                except Exception as e:
                    logger.warning(
                        f"Asset '{context.asset_rule.name}': Error checking alpha for {details.get('temp_processed_file', 'N/A')}: {e}"
                    )
                    continue

        if source_map_details_for_alpha is None or source_file_rule_id_for_alpha is None:
            logger.info(
                f"Asset '{context.asset_rule.name}': No suitable source map with alpha channel found "
                f"for MASK extraction."
            )
            return context

        # C. Extract the alpha channel
        source_image_path = Path(source_map_details_for_alpha['temp_processed_file'])
        full_image_data = ipu.load_image(source_image_path)  # Reload to ensure we have the original RGBA

        if full_image_data is None or not (full_image_data.ndim == 3 and full_image_data.shape[2] == 4):
            logger.error(
                f"Asset '{context.asset_rule.name}': Failed to reload or verify alpha channel from "
                f"{source_image_path} for MASK extraction."
            )
            return context

        alpha_channel: np.ndarray = full_image_data[:, :, 3]  # Extract alpha (0-255)

        # D. Save the new temporary MASK map
        # The mask should be a 2D grayscale image; ipu.save_image is assumed to
        # accept a 2D numpy array for grayscale data.
        if alpha_channel.ndim == 2:  # Expected
            pass
        elif alpha_channel.ndim == 3 and alpha_channel.shape[2] == 1:  # (H, W, 1)
            alpha_channel = alpha_channel.squeeze(axis=2)
        else:
            logger.error(
                f"Asset '{context.asset_rule.name}': Extracted alpha channel has unexpected dimensions: "
                f"{alpha_channel.shape}. Cannot save."
            )
            return context

        mask_temp_filename = (
            f"mask_from_alpha_{sanitize_filename(source_map_details_for_alpha['map_type'])}"
            f"_{source_file_rule_id_for_alpha}{source_image_path.suffix}"
        )
        mask_temp_path = context.engine_temp_dir / mask_temp_filename

        save_success = ipu.save_image(mask_temp_path, alpha_channel)

        if not save_success:
            logger.error(
                f"Asset '{context.asset_rule.name}': Failed to save extracted alpha mask to {mask_temp_path}."
            )
            return context

        logger.info(
            f"Asset '{context.asset_rule.name}': Extracted alpha and saved as new MASK map: {mask_temp_path}"
        )

        # E. Create a new FileRule for the MASK and update the context
        new_mask_file_rule_id = str(uuid.uuid4())

        new_mask_file_rule = FileRule(
            id=new_mask_file_rule_id,
            map_type="MASK",
            filename_pattern=mask_temp_path.name,  # Pattern matches the generated temp file
            item_type="MAP_COL",  # Considered a collected map post-generation
            active=True,
            transform_settings=TransformSettings(),  # Default transform settings
            source_map_ids_for_generation=[source_file_rule_id_for_alpha],  # Link to the original source
            # Ensure other necessary FileRule fields are defaulted or set if required
        )

        context.files_to_process.append(new_mask_file_rule)

        original_dims = source_map_details_for_alpha.get('original_dimensions')
        if original_dims is None and full_image_data is not None:  # Fallback if not in details
            original_dims = (full_image_data.shape[1], full_image_data.shape[0])

        # Key the details entry by the same string ID as the FileRule so later
        # stages can look it up consistently.
        context.processed_maps_details[new_mask_file_rule_id] = {
            'map_type': "MASK",
            'source_file': str(source_image_path),  # Original RGBA map path
            'temp_processed_file': str(mask_temp_path),  # Path to the new MASK map
            'original_dimensions': original_dims,  # Dimensions of the source image
            'processed_dimensions': (alpha_channel.shape[1], alpha_channel.shape[0]),  # Dimensions of MASK
            'status': 'Processed',  # This map is now considered processed
            'notes': (
                f"Generated from alpha of {source_map_details_for_alpha['map_type']} "
                f"(Source Rule ID: {source_file_rule_id_for_alpha})"
            ),
            'file_rule_id': new_mask_file_rule_id,  # Link back to the new FileRule ID
        }

        logger.info(
            f"Asset '{context.asset_rule.name}': Added new FileRule for generated MASK "
            f"(ID: {new_mask_file_rule_id}) and updated processed_maps_details."
        )

        return context
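The slicing at the heart of the stage's step C, pulling the alpha plane out of an RGBA array, can be shown standalone; a minimal NumPy illustration (the sample array is invented):

```python
import numpy as np

# A tiny 2x2 RGBA image; channel index 3 holds the alpha values.
rgba = np.zeros((2, 2, 4), dtype=np.uint8)
rgba[..., 3] = [[0, 128], [255, 64]]

# Slicing the last channel yields a 2D grayscale mask with the
# same height/width as the source image.
alpha = rgba[:, :, 3]
```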
48 processing/pipeline/stages/asset_skip_logic.py (new file)
@@ -0,0 +1,48 @@
import logging

from .base_stage import ProcessingStage
from ..asset_context import AssetProcessingContext

logger = logging.getLogger(__name__)


class AssetSkipLogicStage(ProcessingStage):
    """
    Processing stage that determines whether an asset should be skipped based on various conditions.
    """
    def execute(self, context: AssetProcessingContext) -> AssetProcessingContext:
        """
        Executes the asset skip logic.

        Args:
            context: The asset processing context.

        Returns:
            The updated asset processing context.
        """
        context.status_flags['skip_asset'] = False  # Initialize/reset skip flag

        # 1. Check for a supplier error. 'supplier_error' may be set by a previous
        # stage (e.g., SupplierDeterminationStage), or effective_supplier may still
        # be None after attempts to determine it.
        if context.effective_supplier is None or context.status_flags.get('supplier_error', False):
            logger.info(f"Asset '{context.asset_rule.name}': Skipping due to missing or invalid supplier.")
            context.status_flags['skip_asset'] = True
            context.status_flags['skip_reason'] = "Invalid or missing supplier"
            return context

        # 2. Check asset_rule.process_status
        if context.asset_rule.process_status == "SKIP":
            logger.info(f"Asset '{context.asset_rule.name}': Skipping as per process_status 'SKIP'.")
            context.status_flags['skip_asset'] = True
            context.status_flags['skip_reason'] = "Process status set to SKIP"
            return context

        # 3. Skip already-processed assets unless overwrite is enabled.
        if context.asset_rule.process_status == "PROCESSED" and \
                not context.config_obj.general_settings.overwrite_existing:
            logger.info(
                f"Asset '{context.asset_rule.name}': Skipping as it's already 'PROCESSED' "
                f"and overwrite is disabled."
            )
            context.status_flags['skip_asset'] = True
            context.status_flags['skip_reason'] = "Already processed, overwrite disabled"
            return context

        # None of the skip conditions were met; skip_asset remains False.
        return context
22 processing/pipeline/stages/base_stage.py (new file)
@@ -0,0 +1,22 @@
from abc import ABC, abstractmethod

from ..asset_context import AssetProcessingContext


class ProcessingStage(ABC):
    """
    Abstract base class for a stage in the asset processing pipeline.
    """

    @abstractmethod
    def execute(self, context: AssetProcessingContext) -> AssetProcessingContext:
        """
        Executes the processing logic of this stage.

        Args:
            context: The current asset processing context.

        Returns:
            The updated asset processing context.
        """
        pass
80
processing/pipeline/stages/file_rule_filter.py
Normal file
80
processing/pipeline/stages/file_rule_filter.py
Normal file
@@ -0,0 +1,80 @@
import logging
import fnmatch
from typing import List, Set

from ..base_stage import ProcessingStage
from ...asset_context import AssetProcessingContext
from ....rule_structure import FileRule


class FileRuleFilterStage(ProcessingStage):
    """
    Determines which FileRules associated with an AssetRule should be processed.
    Populates context.files_to_process, respecting FILE_IGNORE rules.
    """

    def execute(self, context: AssetProcessingContext) -> AssetProcessingContext:
        """
        Executes the file rule filtering logic.

        Args:
            context: The AssetProcessingContext for the current asset.

        Returns:
            The modified AssetProcessingContext.
        """
        if context.status_flags.get('skip_asset'):
            logging.debug(f"Asset '{context.asset_rule.name}': Skipping FileRuleFilterStage due to 'skip_asset' flag.")
            return context

        context.files_to_process: List[FileRule] = []
        ignore_patterns: Set[str] = set()

        # Step 1: Collect all FILE_IGNORE patterns
        if context.asset_rule and context.asset_rule.file_rules:
            for file_rule in context.asset_rule.file_rules:
                if file_rule.item_type == "FILE_IGNORE" and file_rule.active:
                    ignore_patterns.add(file_rule.filename_pattern)
                    logging.debug(
                        f"Asset '{context.asset_rule.name}': Registering ignore pattern: '{file_rule.filename_pattern}'"
                    )
        else:
            logging.debug(f"Asset '{context.asset_rule.name if context.asset_rule else 'Unknown'}': No file rules to process or asset_rule is None.")
            # Still need to return context even if there are no rules
            logging.info(f"Asset '{context.asset_rule.name if context.asset_rule else 'Unknown'}': 0 file rules queued for processing after filtering.")
            return context

        # Step 2: Filter and add processable FileRules
        for file_rule in context.asset_rule.file_rules:
            if not file_rule.active:
                logging.debug(
                    f"Asset '{context.asset_rule.name}': Skipping inactive file rule '{file_rule.filename_pattern}'."
                )
                continue

            if file_rule.item_type == "FILE_IGNORE":
                # Already processed, skip.
                continue

            is_ignored = False
            for ignore_pat in ignore_patterns:
                if fnmatch.fnmatch(file_rule.filename_pattern, ignore_pat):
                    is_ignored = True
                    logging.debug(
                        f"Asset '{context.asset_rule.name}': Skipping file rule '{file_rule.filename_pattern}' "
                        f"due to matching ignore pattern '{ignore_pat}'."
                    )
                    break

            if not is_ignored:
                context.files_to_process.append(file_rule)
                logging.debug(
                    f"Asset '{context.asset_rule.name}': Adding file rule '{file_rule.filename_pattern}' "
                    f"(type: {file_rule.item_type}) to processing queue."
                )

        logging.info(
            f"Asset '{context.asset_rule.name}': {len(context.files_to_process)} file rules queued for processing after filtering."
        )
        return context
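The ignore logic in this stage reduces to plain `fnmatch` matching of rule patterns against collected `FILE_IGNORE` patterns. A minimal standalone sketch (the tuples below stand in for `FileRule` objects and are purely illustrative):

```python
import fnmatch

# Hypothetical stand-ins for FileRule objects: (filename_pattern, item_type, active)
rules = [
    ("*_preview.png", "FILE_IGNORE", True),
    ("wood_col.png", "MAP_COL", True),
    ("wood_preview.png", "MAP_COL", True),
    ("wood_rough.png", "MAP_ROUGH", False),
]

# Step 1: collect active FILE_IGNORE patterns
ignore_patterns = {p for (p, t, active) in rules if t == "FILE_IGNORE" and active}

# Step 2: queue active, non-ignore rules that match no ignore pattern
queued = [
    p for (p, t, active) in rules
    if active and t != "FILE_IGNORE"
    and not any(fnmatch.fnmatch(p, ig) for ig in ignore_patterns)
]
print(queued)  # ['wood_col.png']
```

Note that `fnmatch` is case-insensitive on some platforms; `fnmatch.fnmatchcase` would make the matching deterministic if that matters for the pipeline.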
156
processing/pipeline/stages/gloss_to_rough_conversion.py
Normal file
@@ -0,0 +1,156 @@
import logging
from pathlib import Path
import numpy as np
from typing import List

from ..base_stage import ProcessingStage
from ...asset_context import AssetProcessingContext
from ....rule_structure import FileRule
from ...utils import image_processing_utils as ipu
from ....utils.path_utils import sanitize_filename

logger = logging.getLogger(__name__)


class GlossToRoughConversionStage(ProcessingStage):
    """
    Processing stage to convert glossiness maps to roughness maps.
    Iterates through FileRules, identifies GLOSS maps, loads their
    corresponding temporary processed images, inverts them, and saves
    them as new temporary ROUGHNESS maps. Updates the FileRule and
    context.processed_maps_details accordingly.
    """

    def execute(self, context: AssetProcessingContext) -> AssetProcessingContext:
        """
        Executes the gloss to roughness conversion logic.

        Args:
            context: The AssetProcessingContext containing asset and processing details.

        Returns:
            The updated AssetProcessingContext.
        """
        if context.status_flags.get('skip_asset'):
            logger.debug(f"Asset '{context.asset_rule.name}': Skipping GlossToRoughConversionStage due to skip_asset flag.")
            return context

        if not context.files_to_process or not context.processed_maps_details:
            logger.debug(
                f"Asset '{context.asset_rule.name}': No files to process or processed_maps_details empty "
                f"in GlossToRoughConversionStage. Skipping."
            )
            return context

        new_files_to_process: List[FileRule] = []
        processed_a_gloss_map = False

        logger.info(f"Asset '{context.asset_rule.name}': Starting Gloss to Roughness Conversion Stage.")

        for file_rule in context.files_to_process:
            if file_rule.map_type == "GLOSS":
                map_detail_key = file_rule.id.hex
                if map_detail_key not in context.processed_maps_details:
                    logger.warning(
                        f"Asset '{context.asset_rule.name}': GLOSS map '{file_rule.source_file_path}' "
                        f"(ID: {map_detail_key}) found in files_to_process but not in processed_maps_details. "
                        f"Adding original rule and skipping conversion for this map."
                    )
                    new_files_to_process.append(file_rule)
                    continue

                map_details = context.processed_maps_details[map_detail_key]

                if map_details.get('status') != 'Processed' or 'temp_processed_file' not in map_details:
                    logger.warning(
                        f"Asset '{context.asset_rule.name}': GLOSS map '{file_rule.source_file_path}' "
                        f"(ID: {map_detail_key}) not successfully processed by previous stage or temp file missing. "
                        f"Status: {map_details.get('status')}. Adding original rule and skipping conversion."
                    )
                    new_files_to_process.append(file_rule)
                    continue

                original_temp_path_str = map_details['temp_processed_file']
                original_temp_path = Path(original_temp_path_str)

                if not original_temp_path.exists():
                    logger.error(
                        f"Asset '{context.asset_rule.name}': Temporary file {original_temp_path_str} for GLOSS map "
                        f"(ID: {map_detail_key}) does not exist. Adding original rule and skipping conversion."
                    )
                    new_files_to_process.append(file_rule)
                    continue

                logger.debug(f"Asset '{context.asset_rule.name}': Processing GLOSS map {original_temp_path} for conversion.")
                image_data = ipu.load_image(original_temp_path)

                if image_data is None:
                    logger.error(
                        f"Asset '{context.asset_rule.name}': Failed to load image data from {original_temp_path} "
                        f"for GLOSS map (ID: {map_detail_key}). Adding original rule and skipping conversion."
                    )
                    new_files_to_process.append(file_rule)
                    continue

                # Perform Inversion
                inverted_image_data: np.ndarray
                if np.issubdtype(image_data.dtype, np.floating):
                    inverted_image_data = 1.0 - image_data
                    inverted_image_data = np.clip(inverted_image_data, 0.0, 1.0)  # Ensure range for floats
                    logger.debug(f"Asset '{context.asset_rule.name}': Inverted float image data for {original_temp_path}.")
                elif np.issubdtype(image_data.dtype, np.integer):
                    max_val = np.iinfo(image_data.dtype).max
                    inverted_image_data = max_val - image_data
                    logger.debug(f"Asset '{context.asset_rule.name}': Inverted integer image data (max_val: {max_val}) for {original_temp_path}.")
                else:
                    logger.error(
                        f"Asset '{context.asset_rule.name}': Unsupported image data type {image_data.dtype} "
                        f"for GLOSS map {original_temp_path}. Cannot invert. Adding original rule."
                    )
                    new_files_to_process.append(file_rule)
                    continue

                # Save New Temporary (Roughness) Map
                # Using original_temp_path.suffix ensures we keep the format (e.g., .png, .exr)
                new_temp_filename = f"rough_from_gloss_{sanitize_filename(file_rule.map_type)}_{file_rule.id.hex}{original_temp_path.suffix}"
                new_temp_path = context.engine_temp_dir / new_temp_filename

                save_success = ipu.save_image(new_temp_path, inverted_image_data)

                if save_success:
                    logger.info(
                        f"Asset '{context.asset_rule.name}': Converted GLOSS map {original_temp_path} "
                        f"to ROUGHNESS map {new_temp_path}."
                    )

                    modified_file_rule = file_rule.model_copy(deep=True)
                    modified_file_rule.map_type = "ROUGHNESS"

                    # Update context.processed_maps_details for the original file_rule.id.hex
                    context.processed_maps_details[map_detail_key]['temp_processed_file'] = str(new_temp_path)
                    context.processed_maps_details[map_detail_key]['original_map_type_before_conversion'] = "GLOSS"
                    context.processed_maps_details[map_detail_key]['notes'] = "Converted from GLOSS by GlossToRoughConversionStage"

                    new_files_to_process.append(modified_file_rule)
                    processed_a_gloss_map = True
                else:
                    logger.error(
                        f"Asset '{context.asset_rule.name}': Failed to save inverted ROUGHNESS map to {new_temp_path} "
                        f"for original GLOSS map (ID: {map_detail_key}). Adding original rule."
                    )
                    new_files_to_process.append(file_rule)
            else:  # Not a gloss map
                new_files_to_process.append(file_rule)

        context.files_to_process = new_files_to_process

        if processed_a_gloss_map:
            logger.info(
                f"Asset '{context.asset_rule.name}': Gloss to Roughness conversion stage successfully processed one or more maps and updated file list."
            )
        else:
            logger.debug(
                f"Asset '{context.asset_rule.name}': No gloss maps were successfully converted in GlossToRoughConversionStage. "
                f"File list for next stage contains original non-gloss maps and any gloss maps that failed conversion."
            )

        return context
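The dtype-aware inversion at the heart of this stage can be exercised in isolation. A minimal NumPy sketch (the `invert_gloss` helper is illustrative, not part of the codebase):

```python
import numpy as np

def invert_gloss(img: np.ndarray) -> np.ndarray:
    """Invert a glossiness map into roughness, mirroring the dtype handling above."""
    if np.issubdtype(img.dtype, np.floating):
        # Floats are assumed normalized to [0, 1]
        return np.clip(1.0 - img, 0.0, 1.0)
    if np.issubdtype(img.dtype, np.integer):
        # Integers invert against the dtype's full range (255 for uint8, 65535 for uint16)
        return np.iinfo(img.dtype).max - img
    raise TypeError(f"Unsupported dtype: {img.dtype}")

gloss_u8 = np.array([[0, 128, 255]], dtype=np.uint8)
print(invert_gloss(gloss_u8))  # [[255 127   0]]

gloss_f32 = np.array([[0.0, 0.25, 1.0]], dtype=np.float32)
print(invert_gloss(gloss_f32))
```

Inverting against the dtype maximum rather than a hard-coded 255 is what lets the same stage handle 8-bit PNGs and 16-bit TIFFs without special cases.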
245
processing/pipeline/stages/individual_map_processing.py
Normal file
@@ -0,0 +1,245 @@
import os
import logging
from pathlib import Path
from typing import Optional, Tuple, Dict

import cv2
import numpy as np

from ..base_stage import ProcessingStage
from ...asset_context import AssetProcessingContext
from ....rule_structure import FileRule, TransformSettings
from ....utils.path_utils import sanitize_filename
from ...utils import image_processing_utils as ipu

logger = logging.getLogger(__name__)


class IndividualMapProcessingStage(ProcessingStage):
    """
    Processes individual texture map files based on FileRules.
    This stage finds the source file, loads it, applies transformations
    (resize, color space), saves a temporary processed version, and updates
    the AssetProcessingContext with details.
    """

    def execute(self, context: AssetProcessingContext) -> AssetProcessingContext:
        """
        Executes the individual map processing logic.
        """
        if context.status_flags.get('skip_asset', False):
            logger.info(f"Asset {context.asset_id}: Skipping individual map processing due to skip_asset flag.")
            return context

        if not hasattr(context, 'processed_maps_details') or context.processed_maps_details is None:
            context.processed_maps_details = {}
            logger.debug(f"Asset {context.asset_id}: Initialized processed_maps_details.")

        if not context.files_to_process:
            logger.info(f"Asset {context.asset_id}: No files to process in this stage.")
            return context

        source_base_path = Path(context.asset_rule.source_path)
        if not source_base_path.is_dir():
            logger.error(f"Asset {context.asset_id}: Source path '{source_base_path}' is not a valid directory. Skipping individual map processing.")
            context.status_flags['individual_map_processing_failed'] = True
            # Mark all map-type file_rules as failed since the source path is invalid
            for file_rule in context.files_to_process:
                if file_rule.item_type.startswith("MAP_"):  # General check for map types
                    self._update_file_rule_status(context, file_rule.id.hex, 'Failed', details="Source path invalid")
            return context

        for file_rule in context.files_to_process:
            # Only rules whose item_type is a direct file mapping (e.g. "MAP_COL",
            # "MAP_NORM", "MAP_ROUGH") are processed here. Generated or composite
            # map types (e.g. "MAP_GEN_COMPOSITE") are produced by later stages,
            # so they are skipped. Refine this check as item_type usage evolves.
            if not file_rule.item_type or not file_rule.item_type.startswith("MAP_") or file_rule.item_type == "MAP_GEN_COMPOSITE":
                logger.debug(f"Asset {context.asset_id}, FileRule {file_rule.id.hex} ({file_rule.map_type}): Skipping, item_type '{file_rule.item_type}' not targeted for individual processing.")
                continue

            logger.info(f"Asset {context.asset_id}, FileRule {file_rule.id.hex} ({file_rule.map_type}): Starting individual processing.")

            # A. Find Source File
            source_file_path = self._find_source_file(source_base_path, file_rule.filename_pattern, context.asset_id, file_rule.id.hex)
            if not source_file_path:
                logger.error(f"Asset {context.asset_id}, FileRule {file_rule.id.hex} ({file_rule.map_type}): Source file not found with pattern '{file_rule.filename_pattern}' in '{source_base_path}'.")
                self._update_file_rule_status(context, file_rule.id.hex, 'Failed', map_type=file_rule.map_type, details="Source file not found")
                continue

            # B. Load and Transform Image
            image_data: Optional[np.ndarray] = ipu.load_image(str(source_file_path))
            if image_data is None:
                logger.error(f"Asset {context.asset_id}, FileRule {file_rule.id.hex} ({file_rule.map_type}): Failed to load image from '{source_file_path}'.")
                self._update_file_rule_status(context, file_rule.id.hex, 'Failed', map_type=file_rule.map_type, source_file=str(source_file_path), details="Image load failed")
                continue

            original_height, original_width = image_data.shape[:2]
            logger.debug(f"Asset {context.asset_id}, FileRule {file_rule.id.hex}: Loaded image '{source_file_path}' with dimensions {original_width}x{original_height}.")

            transform: TransformSettings = file_rule.transform_settings

            target_width, target_height = ipu.calculate_target_dimensions(
                original_width, original_height,
                transform.target_width, transform.target_height,
                transform.resize_mode,
                transform.ensure_pot,
                transform.allow_upscale
            )
            logger.debug(f"Asset {context.asset_id}, FileRule {file_rule.id.hex}: Original dims: ({original_width},{original_height}), Calculated target dims: ({target_width},{target_height})")

            processed_image_data = image_data.copy()  # Start with a copy

            if (target_width, target_height) != (original_width, original_height):
                logger.info(f"Asset {context.asset_id}, FileRule {file_rule.id.hex}: Resizing from ({original_width},{original_height}) to ({target_width},{target_height}).")
                # Map resize_filter string to cv2 interpolation constant
                interpolation_map = {
                    "NEAREST": cv2.INTER_NEAREST,
                    "LINEAR": cv2.INTER_LINEAR,
                    "CUBIC": cv2.INTER_CUBIC,
                    "AREA": cv2.INTER_AREA,  # Good for downscaling
                    "LANCZOS4": cv2.INTER_LANCZOS4
                }
                interpolation = interpolation_map.get(transform.resize_filter.upper(), cv2.INTER_AREA)  # Default to INTER_AREA
                processed_image_data = ipu.resize_image(processed_image_data, target_width, target_height, interpolation=interpolation)
                if processed_image_data is None:  # Should not happen if resize_image handles errors, but good practice
                    logger.error(f"Asset {context.asset_id}, FileRule {file_rule.id.hex} ({file_rule.map_type}): Failed to resize image.")
                    self._update_file_rule_status(context, file_rule.id.hex, 'Failed', map_type=file_rule.map_type, source_file=str(source_file_path), original_dimensions=(original_width, original_height), details="Image resize failed")
                    continue

            # Color Space Conversion (simplified)
            # Assuming ipu.load_image loads as BGR if color.
            # This needs more robust handling of source color profiles if they are known.
            if transform.color_profile_management and transform.target_color_profile == "RGB":
                if len(processed_image_data.shape) == 3 and processed_image_data.shape[2] == 3:  # Check if it's a color image
                    logger.info(f"Asset {context.asset_id}, FileRule {file_rule.id.hex}: Converting BGR to RGB.")
                    processed_image_data = ipu.convert_bgr_to_rgb(processed_image_data)
                elif len(processed_image_data.shape) == 3 and processed_image_data.shape[2] == 4:  # Check for BGRA
                    logger.info(f"Asset {context.asset_id}, FileRule {file_rule.id.hex}: Converting BGRA to RGBA.")
                    processed_image_data = ipu.convert_bgra_to_rgba(processed_image_data)

            # C. Save Temporary Processed Map
            # Ensure engine_temp_dir exists (orchestrator should handle this, but good to be safe)
            if not context.engine_temp_dir.exists():
                try:
                    context.engine_temp_dir.mkdir(parents=True, exist_ok=True)
                    logger.info(f"Asset {context.asset_id}: Created engine_temp_dir at '{context.engine_temp_dir}'")
                except OSError as e:
                    logger.error(f"Asset {context.asset_id}: Failed to create engine_temp_dir '{context.engine_temp_dir}': {e}")
                    self._update_file_rule_status(context, file_rule.id.hex, 'Failed', map_type=file_rule.map_type, source_file=str(source_file_path), details="Failed to create temp directory")
                    continue  # Or potentially fail the whole asset processing here

            temp_filename_suffix = source_file_path.suffix
            # Use a more descriptive name if possible, including map_type
            safe_map_type = sanitize_filename(file_rule.map_type if file_rule.map_type else "unknown_map")
            temp_output_filename = f"processed_{safe_map_type}_{file_rule.id.hex}{temp_filename_suffix}"
            temp_output_path = context.engine_temp_dir / temp_output_filename

            # Consider output_format_settings from transform if they apply here.
            # For now, save_image handles basic saving.
            # Example: cv2.imwrite params for quality for JPG, compression for PNG
            save_params = []
            if transform.output_format_settings:
                if temp_filename_suffix.lower() in ['.jpg', '.jpeg']:
                    quality = transform.output_format_settings.get('quality', 95)
                    save_params = [cv2.IMWRITE_JPEG_QUALITY, quality]
                elif temp_filename_suffix.lower() == '.png':
                    compression = transform.output_format_settings.get('compression_level', 3)  # 0-9, 3 is default
                    save_params = [cv2.IMWRITE_PNG_COMPRESSION, compression]
                # Add more formats as needed (e.g., EXR, TIFF)

            save_success = ipu.save_image(str(temp_output_path), processed_image_data, params=save_params)

            if not save_success:
                logger.error(f"Asset {context.asset_id}, FileRule {file_rule.id.hex} ({file_rule.map_type}): Failed to save temporary image to '{temp_output_path}'.")
                self._update_file_rule_status(
                    context, file_rule.id.hex, 'Failed',
                    map_type=file_rule.map_type,
                    source_file=str(source_file_path),
                    original_dimensions=(original_width, original_height),
                    processed_dimensions=(processed_image_data.shape[1], processed_image_data.shape[0]) if processed_image_data is not None else None,
                    details="Temporary image save failed"
                )
                continue

            logger.info(f"Asset {context.asset_id}, FileRule {file_rule.id.hex} ({file_rule.map_type}): Successfully processed and saved temporary map to '{temp_output_path}'.")

            # D. Update Context
            self._update_file_rule_status(
                context, file_rule.id.hex, 'Processed',
                map_type=file_rule.map_type,
                source_file=str(source_file_path),
                temp_processed_file=str(temp_output_path),
                original_dimensions=(original_width, original_height),
                processed_dimensions=(processed_image_data.shape[1], processed_image_data.shape[0]),
                details="Successfully processed"
            )

            # Optional: Update context.asset_metadata['processed_files']
            if 'processed_files' not in context.asset_metadata:
                context.asset_metadata['processed_files'] = []
            context.asset_metadata['processed_files'].append({
                'file_rule_id': file_rule.id.hex,
                'path': str(temp_output_path),
                'type': 'temporary_map',
                'map_type': file_rule.map_type
            })

        logger.info(f"Asset {context.asset_id}: Finished individual map processing stage.")
        return context

    def _find_source_file(self, base_path: Path, pattern: str, asset_id: str, file_rule_id_hex: str) -> Optional[Path]:
        """
        Finds a single source file matching the pattern within the base_path.
        Adapts logic from ProcessingEngine._find_source_file.
        """
        if not pattern:
            logger.warning(f"Asset {asset_id}, FileRule {file_rule_id_hex}: Empty filename pattern provided.")
            return None

        try:
            # rglob searches recursively; an exact filename or a glob pattern both work.
            matched_files = list(base_path.rglob(pattern))

            if not matched_files:
                logger.debug(f"Asset {asset_id}, FileRule {file_rule_id_hex}: No files found matching pattern '{pattern}' in '{base_path}' (recursive).")
                # Try non-recursive if rglob fails and pattern might be for top-level
                matched_files_non_recursive = list(base_path.glob(pattern))
                if matched_files_non_recursive:
                    logger.debug(f"Asset {asset_id}, FileRule {file_rule_id_hex}: Found {len(matched_files_non_recursive)} files non-recursively. Using first: {matched_files_non_recursive[0]}")
                    return matched_files_non_recursive[0]
                return None

            if len(matched_files) > 1:
                logger.warning(f"Asset {asset_id}, FileRule {file_rule_id_hex}: Multiple files ({len(matched_files)}) found for pattern '{pattern}' in '{base_path}'. Using the first one: {matched_files[0]}. Files: {matched_files}")

            return matched_files[0]

        except Exception as e:
            logger.error(f"Asset {asset_id}, FileRule {file_rule_id_hex}: Error searching for file with pattern '{pattern}' in '{base_path}': {e}")
            return None

    def _update_file_rule_status(self, context: AssetProcessingContext, file_rule_id_hex: str, status: str, **kwargs):
        """Helper to update processed_maps_details for a file_rule."""
        if file_rule_id_hex not in context.processed_maps_details:
            context.processed_maps_details[file_rule_id_hex] = {}

        context.processed_maps_details[file_rule_id_hex]['status'] = status
        for key, value in kwargs.items():
            context.processed_maps_details[file_rule_id_hex][key] = value

        logger.debug(f"Asset {context.asset_id}, FileRule {file_rule_id_hex}: Status updated to '{status}'. Details: {kwargs}")
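The `ensure_pot` flag passed to `calculate_target_dimensions` implies rounding dimensions to powers of two, which GPU texture pipelines commonly require. A hedged sketch of one possible rounding policy (the real utility in `image_processing_utils` may round differently):

```python
def nearest_pot(n: int, allow_upscale: bool = True) -> int:
    """Round a texture dimension to a power of two.

    Illustrative policy only: picks the nearest power of two, or the
    nearest one below when upscaling is disallowed.
    """
    if n < 1:
        return 1
    lower = 1 << (n.bit_length() - 1)  # largest power of two <= n
    upper = lower << 1                 # smallest power of two > n
    if not allow_upscale:
        return lower
    # Choose whichever power of two is closer to the original size
    return upper if (upper - n) < (n - lower) else lower

print(nearest_pot(1000))                       # 1024
print(nearest_pot(700))                        # 512
print(nearest_pot(1000, allow_upscale=False))  # 512
```

Rounding width and height independently with such a helper is one way to honor `ensure_pot` while `allow_upscale` caps the result at the source resolution.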
310
processing/pipeline/stages/map_merging.py
Normal file
@@ -0,0 +1,310 @@
import logging
from pathlib import Path
from typing import Dict, Optional, List, Tuple

import numpy as np
import cv2  # For potential direct cv2 operations if ipu doesn't cover all merge needs

from ..base_stage import ProcessingStage
from ...asset_context import AssetProcessingContext
from ....rule_structure import FileRule, MergeSettings, MergeInputChannel
from ....utils.path_utils import sanitize_filename
from ...utils import image_processing_utils as ipu


logger = logging.getLogger(__name__)


class MapMergingStage(ProcessingStage):
    """
    Merges individually processed maps based on MAP_MERGE rules.
    This stage performs operations like channel packing.
    """

    def execute(self, context: AssetProcessingContext) -> AssetProcessingContext:
        """
        Executes the map merging logic.

        Args:
            context: The asset processing context.

        Returns:
            The updated asset processing context.
        """
        if context.status_flags.get('skip_asset'):
            logger.info(f"Skipping map merging for asset {context.asset_name} as skip_asset flag is set.")
            return context

        if not hasattr(context, 'merged_maps_details'):
            context.merged_maps_details = {}

        if not hasattr(context, 'processed_maps_details'):
            logger.warning(f"Asset {context.asset_name}: 'processed_maps_details' not found in context. Cannot perform map merging.")
            return context

        if not context.files_to_process:
            logger.info(f"Asset {context.asset_name}: No files_to_process defined. Skipping map merging.")
            return context

        logger.info(f"Starting MapMergingStage for asset: {context.asset_name}")

        for merge_rule in context.files_to_process:
            if not isinstance(merge_rule, FileRule) or merge_rule.item_type != "MAP_MERGE":
                continue

            if not merge_rule.merge_settings:
                logger.error(f"Asset {context.asset_name}, Rule ID {merge_rule.id.hex}: Merge rule for map_type '{merge_rule.map_type}' is missing merge_settings. Skipping this merge.")
                context.merged_maps_details[merge_rule.id.hex] = {
                    'map_type': merge_rule.map_type,
                    'status': 'Failed',
                    'reason': 'Missing merge_settings in FileRule.'
                }
                continue

            merge_settings: MergeSettings = merge_rule.merge_settings
            output_map_type = merge_rule.map_type
            rule_id_hex = merge_rule.id.hex
            logger.info(f"Processing MAP_MERGE rule for '{output_map_type}' (ID: {rule_id_hex})")

            loaded_input_maps: Dict[str, np.ndarray] = {}
            input_map_paths: Dict[str, str] = {}
            target_dims: Optional[Tuple[int, int]] = None  # width, height
            all_inputs_valid = True

            # A. Load Input Maps for Merging
            if not merge_settings.input_maps:
                logger.warning(f"Asset {context.asset_name}, Rule ID {rule_id_hex}: No input_maps defined in merge_settings for '{output_map_type}'. Skipping this merge.")
                context.merged_maps_details[rule_id_hex] = {
                    'map_type': output_map_type,
                    'status': 'Failed',
                    'reason': 'No input_maps defined in merge_settings.'
                }
                continue

            for input_map_config in merge_settings.input_maps:
                input_rule_id_hex = input_map_config.file_rule_id.hex
                processed_detail = context.processed_maps_details.get(input_rule_id_hex)

                if not processed_detail or processed_detail.get('status') != 'Processed':
                    error_msg = f"Input map (Rule ID: {input_rule_id_hex}) for merge rule '{output_map_type}' (Rule ID: {rule_id_hex}) not found or not processed. Details: {processed_detail}"
                    logger.error(error_msg)
                    all_inputs_valid = False
                    context.merged_maps_details[rule_id_hex] = {
                        'map_type': output_map_type,
                        'status': 'Failed',
                        'reason': f"Input map {input_rule_id_hex} not processed or missing."
                    }
                    break

                temp_processed_file_path = Path(processed_detail['temp_processed_file'])
                if not temp_processed_file_path.exists():
                    error_msg = f"Input map file {temp_processed_file_path} for merge rule '{output_map_type}' (Rule ID: {rule_id_hex}) does not exist."
                    logger.error(error_msg)
                    all_inputs_valid = False
                    context.merged_maps_details[rule_id_hex] = {
                        'map_type': output_map_type,
                        'status': 'Failed',
                        'reason': f"Input map file {temp_processed_file_path} not found."
                    }
                    break

                try:
                    image_data = ipu.load_image(temp_processed_file_path)
                except Exception as e:
                    logger.error(f"Error loading image {temp_processed_file_path} for merge rule '{output_map_type}' (Rule ID: {rule_id_hex}): {e}")
                    all_inputs_valid = False
                    context.merged_maps_details[rule_id_hex] = {
                        'map_type': output_map_type,
                        'status': 'Failed',
                        'reason': f"Error loading input image {temp_processed_file_path}."
                    }
                    break

                if image_data is None:
                    logger.error(f"Failed to load image data from {temp_processed_file_path} for merge rule '{output_map_type}' (Rule ID: {rule_id_hex}).")
                    all_inputs_valid = False
                    context.merged_maps_details[rule_id_hex] = {
                        'map_type': output_map_type,
                        'status': 'Failed',
                        'reason': f"Failed to load image data from {temp_processed_file_path}."
                    }
                    break

                loaded_input_maps[input_rule_id_hex] = image_data
                input_map_paths[input_rule_id_hex] = str(temp_processed_file_path)

                current_dims = (image_data.shape[1], image_data.shape[0])  # width, height
                if target_dims is None:
                    target_dims = current_dims
                    logger.debug(f"Merge rule '{output_map_type}' (ID: {rule_id_hex}): Set target dimensions to {target_dims} from first input {temp_processed_file_path}.")
                elif current_dims != target_dims:
                    logger.warning(f"Input map {temp_processed_file_path} for merge rule '{output_map_type}' (ID: {rule_id_hex}) has dimensions {current_dims}, but target is {target_dims}. Resizing.")
                    try:
                        image_data = ipu.resize_image(image_data, target_dims[0], target_dims[1])
                        if image_data is None:
                            raise ValueError("Resize operation returned None.")
                        loaded_input_maps[input_rule_id_hex] = image_data
                    except Exception as e:
                        logger.error(f"Failed to resize image {temp_processed_file_path} for merge rule '{output_map_type}' (ID: {rule_id_hex}): {e}")
                        all_inputs_valid = False
                        context.merged_maps_details[rule_id_hex] = {
                            'map_type': output_map_type,
                            'status': 'Failed',
                            'reason': f"Failed to resize input image {temp_processed_file_path}."
                        }
                        break

            if not all_inputs_valid:
                # Failure already logged and recorded in context.merged_maps_details
                logger.warning(f"Skipping merge for '{output_map_type}' (ID: {rule_id_hex}) due to invalid inputs.")
                continue

            if target_dims is None:  # Should not happen if all_inputs_valid is true and there was at least one input map
                logger.error(f"Merge rule '{output_map_type}' (ID: {rule_id_hex}): Target dimensions not determined despite valid inputs. This indicates an issue with input map loading or an empty input_maps list that wasn't caught.")
                context.merged_maps_details[rule_id_hex] = {
                    'map_type': output_map_type,
                    'status': 'Failed',
                    'reason': 'Target dimensions could not be determined.'
                }
                continue

            # B. Perform Merge Operation
            try:
                if merge_settings.output_channels == 1:
                    merged_image = np.zeros((target_dims[1], target_dims[0]), dtype=np.uint8)
                else:
                    merged_image = np.zeros((target_dims[1], target_dims[0], merge_settings.output_channels), dtype=np.uint8)
            except Exception as e:
                logger.error(f"Error creating empty merged image for '{output_map_type}' (ID: {rule_id_hex}) with dims {target_dims} and {merge_settings.output_channels} channels: {e}")
                context.merged_maps_details[rule_id_hex] = {
                    'map_type': output_map_type,
                    'status': 'Failed',
                    'reason': f'Error creating output image canvas: {e}'
                }
                continue

            merge_op_failed = False
            for input_map_config in merge_settings.input_maps:
                source_image = loaded_input_maps[input_map_config.file_rule_id.hex]
                source_channel_index = input_map_config.source_channel
                target_channel_index = input_map_config.target_channel

                source_data = None
                if source_image.ndim == 2:  # Grayscale
                    source_data = source_image
                elif source_image.ndim == 3:  # Multi-channel (e.g. RGB, RGBA)
                    if source_channel_index >= source_image.shape[2]:
                        logger.error(f"Merge rule '{output_map_type}' (ID: {rule_id_hex}): Source channel index {source_channel_index} out of bounds for source image with shape {source_image.shape} (from Rule ID {input_map_config.file_rule_id.hex}).")
                        merge_op_failed = True
                        break
                    source_data = source_image[:, :, source_channel_index]
                else:
                    logger.error(f"Merge rule '{output_map_type}' (ID: {rule_id_hex}): Source image (from Rule ID {input_map_config.file_rule_id.hex}) has unexpected dimensions: {source_image.ndim}. Shape: {source_image.shape}")
                    merge_op_failed = True
                    break

                if source_data is None:  # Should be caught by previous checks
                    logger.error(f"Merge rule '{output_map_type}' (ID: {rule_id_hex}): Failed to extract source_data for unknown reasons from input {input_map_config.file_rule_id.hex}.")
                    merge_op_failed = True
                    break

                # Assign to target channel
                try:
                    if merged_image.ndim == 2:  # Output is grayscale
                        if merge_settings.output_channels != 1:
                            logger.error(f"Merge rule '{output_map_type}' (ID: {rule_id_hex}): Mismatch - merged_image is 2D but output_channels is {merge_settings.output_channels}.")
                            merge_op_failed = True
                            break
                        merged_image = source_data  # Overwrites if multiple inputs map to grayscale; consider blending or specific logic if needed
                    elif merged_image.ndim == 3:  # Output is multi-channel
                        if target_channel_index >= merged_image.shape[2]:
                            logger.error(f"Merge rule '{output_map_type}' (ID: {rule_id_hex}): Target channel index {target_channel_index} out of bounds for merged image with shape {merged_image.shape}.")
                            merge_op_failed = True
                            break
                        merged_image[:, :, target_channel_index] = source_data
                    else:  # Should not happen
                        logger.error(f"Merge rule '{output_map_type}' (ID: {rule_id_hex}): Merged image has unexpected dimensions: {merged_image.ndim}. Shape: {merged_image.shape}")
                        merge_op_failed = True
                        break
                except Exception as e:
                    logger.error(f"Error assigning source data to target channel for '{output_map_type}' (ID: {rule_id_hex}): {e}. Source shape: {source_data.shape}, Target channel: {target_channel_index}, Merged image shape: {merged_image.shape}")
|
||||
merge_op_failed = True
|
||||
break
|
||||
|
||||
if input_map_config.invert_source_channel:
|
||||
if merged_image.ndim == 2:
|
||||
merged_image = 255 - merged_image # Assumes uint8
|
||||
elif merged_image.ndim == 3:
|
||||
# Ensure we are not inverting an alpha channel if that's not desired,
|
||||
# but current spec inverts the target channel data.
|
||||
merged_image[:, :, target_channel_index] = 255 - merged_image[:, :, target_channel_index]
|
||||
|
||||
# input_map_config.default_value_if_missing:
|
||||
# This was handled by all_inputs_valid check for file presence.
|
||||
# If a channel is missing from a multi-channel source, that's an error in source_channel_index.
|
||||
# If a file is entirely missing and a default color/value is needed for the *output channel*,
|
||||
# that would be a different logic, perhaps pre-filling merged_image.
|
||||
# For now, we assume if an input map is specified, it must be present and valid.
|
||||
|
||||
if merge_op_failed:
|
||||
logger.error(f"Merge operation failed for '{output_map_type}' (ID: {rule_id_hex}).")
|
||||
context.merged_maps_details[rule_id_hex] = {
|
||||
'map_type': output_map_type,
|
||||
'status': 'Failed',
|
||||
'reason': 'Error during channel packing/merge operation.'
|
||||
}
|
||||
continue
|
||||
|
||||
# C. Save Temporary Merged Map
|
||||
# Default to PNG, or use format from merge_settings if available (future enhancement)
|
||||
output_format = getattr(merge_settings, 'output_format', 'png').lower()
|
||||
if output_format not in ['png', 'jpg', 'jpeg', 'tif', 'tiff', 'exr']: # Add more as ipu supports
|
||||
logger.warning(f"Unsupported output_format '{output_format}' in merge_settings for '{output_map_type}' (ID: {rule_id_hex}). Defaulting to PNG.")
|
||||
output_format = 'png'
|
||||
|
||||
temp_merged_filename = f"merged_{sanitize_filename(output_map_type)}_{rule_id_hex}.{output_format}"
|
||||
|
||||
if not context.engine_temp_dir:
|
||||
logger.error(f"Asset {context.asset_name}: engine_temp_dir is not set. Cannot save merged map.")
|
||||
context.merged_maps_details[rule_id_hex] = {
|
||||
'map_type': output_map_type,
|
||||
'status': 'Failed',
|
||||
'reason': 'engine_temp_dir not set in context.'
|
||||
}
|
||||
continue
|
||||
|
||||
temp_merged_path = context.engine_temp_dir / temp_merged_filename
|
||||
|
||||
try:
|
||||
save_success = ipu.save_image(temp_merged_path, merged_image)
|
||||
except Exception as e:
|
||||
logger.error(f"Error saving merged image {temp_merged_path} for '{output_map_type}' (ID: {rule_id_hex}): {e}")
|
||||
save_success = False
|
||||
|
||||
if not save_success:
|
||||
logger.error(f"Failed to save temporary merged map to {temp_merged_path} for '{output_map_type}' (ID: {rule_id_hex}).")
|
||||
context.merged_maps_details[rule_id_hex] = {
|
||||
'map_type': output_map_type,
|
||||
'status': 'Failed',
|
||||
'reason': f'Failed to save merged image to {temp_merged_path}.'
|
||||
}
|
||||
continue
|
||||
|
||||
logger.info(f"Successfully merged and saved '{output_map_type}' (ID: {rule_id_hex}) to {temp_merged_path}")
|
||||
|
||||
# D. Update Context
|
||||
context.merged_maps_details[rule_id_hex] = {
|
||||
'map_type': output_map_type,
|
||||
'temp_merged_file': str(temp_merged_path),
|
||||
'input_map_ids_used': [mc.file_rule_id.hex for mc in merge_settings.input_maps],
|
||||
'input_map_files_used': input_map_paths, # Dict[rule_id_hex, path_str]
|
||||
'merged_dimensions': target_dims, # (width, height)
|
||||
'status': 'Processed',
|
||||
'file_rule_id': rule_id_hex # For easier reverse lookup if needed
|
||||
}
|
||||
|
||||
# Optional: Update context.asset_metadata['processed_files'] or similar
|
||||
# This might be better handled by a later stage that finalizes files.
|
||||
# For now, merged_maps_details is the primary record.
|
||||
|
||||
logger.info(f"Finished MapMergingStage for asset: {context.asset_name}. Merged maps: {len(context.merged_maps_details)}")
|
||||
return context
|
||||
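The channel-packing loop above can be exercised in isolation. Below is a minimal sketch in plain NumPy, independent of the pipeline's `ipu` helpers and rule objects (so `pack_channels` and its inputs are illustrative names, not pipeline API), showing two grayscale maps packed into the R and G channels of an RGB canvas, with the G input inverted the way `invert_source_channel` does for uint8 data:

```python
import numpy as np

def pack_channels(inputs, out_channels, size):
    """Pack (source_2d_array, target_channel, invert) tuples into one uint8 image."""
    width, height = size
    merged = np.zeros((height, width, out_channels), dtype=np.uint8)
    for source, target_channel, invert in inputs:
        data = 255 - source if invert else source  # uint8 inversion, as in the stage
        merged[:, :, target_channel] = data
    return merged

# Two hypothetical single-channel maps of the same dimensions
rough = np.full((4, 4), 200, dtype=np.uint8)   # packed into R
metal = np.full((4, 4), 30, dtype=np.uint8)    # packed into G, inverted
packed = pack_channels([(rough, 0, False), (metal, 1, True)], 3, (4, 4))
```

Note the `(height, width)` ordering when allocating the canvas, matching the `(target_dims[1], target_dims[0])` indexing in the stage, since `target_dims` is `(width, height)`.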
119
processing/pipeline/stages/metadata_finalization_save.py
Normal file
@ -0,0 +1,119 @@
import datetime
import json
import logging
from pathlib import Path
from typing import Any, Dict

from ..asset_context import AssetProcessingContext
from .base_stage import ProcessingStage
from ....utils.path_utils import generate_path_from_pattern


logger = logging.getLogger(__name__)


class MetadataFinalizationAndSaveStage(ProcessingStage):
    """
    Finalizes the asset_metadata (e.g., setting the processing end time and
    final status) and saves it as a JSON file.
    """

    def execute(self, context: AssetProcessingContext) -> AssetProcessingContext:
        """
        Finalizes metadata, determines the output path, and saves the metadata JSON file.
        """
        if not hasattr(context, 'asset_metadata') or not context.asset_metadata:
            if context.status_flags.get('skip_asset'):
                logger.info(
                    f"Asset '{context.asset_rule.name if hasattr(context, 'asset_rule') and context.asset_rule else 'Unknown'}': "
                    f"Skipped before metadata initialization. No metadata file will be saved."
                )
            else:
                logger.warning(
                    f"Asset '{context.asset_rule.name if hasattr(context, 'asset_rule') and context.asset_rule else 'Unknown'}': "
                    f"asset_metadata not initialized. Skipping metadata finalization and save."
                )
            return context

        # Check skip flag
        if context.status_flags.get('skip_asset'):
            context.asset_metadata['status'] = "Skipped"
            context.asset_metadata['processing_end_time'] = datetime.datetime.now().isoformat()
            context.asset_metadata['notes'] = context.status_flags.get('skip_reason', 'Skipped early in pipeline')
            logger.info(
                f"Asset '{context.asset_rule.name}': Marked as skipped. Reason: {context.asset_metadata['notes']}"
            )
            # Metadata for skipped assets is still saved if it was initialized.
            # If it should not be, the early return belongs here, before path generation.

        # A. Finalize Metadata
        context.asset_metadata['processing_end_time'] = datetime.datetime.now().isoformat()

        # Determine the final status (unless already set to Skipped)
        if context.asset_metadata.get('status') != "Skipped":
            has_errors = any(
                context.status_flags.get(error_flag)
                for error_flag in ['file_processing_error', 'merge_error', 'critical_error']
            )
            context.asset_metadata['status'] = "Failed" if has_errors else "Processed"

        # Add details of processed and merged maps
        context.asset_metadata['processed_map_details'] = getattr(context, 'processed_maps_details', {})
        context.asset_metadata['merged_map_details'] = getattr(context, 'merged_maps_details', {})

        # (Optional) Record the list of all temporary files
        context.asset_metadata['temporary_files'] = getattr(context, 'temporary_files', [])

        # B. Determine Metadata Output Path
        # Ensure asset_rule and source_rule exist before accessing their names
        asset_name = context.asset_rule.name if hasattr(context, 'asset_rule') and context.asset_rule else "unknown_asset"
        source_rule_name = context.source_rule.name if hasattr(context, 'source_rule') and context.source_rule else "unknown_source"

        metadata_filename = f"{asset_name}_metadata.json"
        output_path_pattern = context.asset_rule.output_path_pattern if hasattr(context, 'asset_rule') and context.asset_rule else ""

        # Handle a potentially missing sha5_value, falling back to sha_value or None
        sha_value = getattr(context, 'sha5_value', getattr(context, 'sha_value', None))

        full_output_path = generate_path_from_pattern(
            base_path=str(context.output_base_path),  # Ensure base_path is a string
            pattern=output_path_pattern,
            asset_name=asset_name,
            map_type="metadata",  # Special map_type for metadata
            filename=metadata_filename,
            source_rule_name=source_rule_name,
            incrementing_value=getattr(context, 'incrementing_value', None),
            sha_value=sha_value  # Changed from sha5_value to sha_value for more generality
        )
        metadata_save_path = Path(full_output_path)

        # C. Save Metadata File
        try:
            metadata_save_path.parent.mkdir(parents=True, exist_ok=True)

            def make_serializable(data: Any) -> Any:
                if isinstance(data, Path):
                    return str(data)
                if isinstance(data, datetime.datetime):  # Ensure datetime is serializable
                    return data.isoformat()
                if isinstance(data, dict):
                    return {k: make_serializable(v) for k, v in data.items()}
                if isinstance(data, list):
                    return [make_serializable(i) for i in data]
                return data

            serializable_metadata = make_serializable(context.asset_metadata)

            with open(metadata_save_path, 'w') as f:
                json.dump(serializable_metadata, f, indent=4)
            logger.info(f"Asset '{asset_name}': Metadata saved to {metadata_save_path}")
            context.asset_metadata['metadata_file_path'] = str(metadata_save_path)
        except Exception as e:
            logger.error(f"Asset '{asset_name}': Failed to save metadata to {metadata_save_path}. Error: {e}")
            context.asset_metadata['status'] = "Failed (Metadata Save Error)"
            context.status_flags['metadata_save_error'] = True

        return context
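The `make_serializable` helper above is self-contained enough to test outside the stage. A minimal sketch of what it does to a nested metadata dict (the sample values here are invented for illustration):

```python
import datetime
import json
from pathlib import Path

def make_serializable(data):
    # Same recursion as in MetadataFinalizationAndSaveStage: Paths and datetimes
    # become strings, dicts and lists are walked, everything else passes through.
    if isinstance(data, Path):
        return str(data)
    if isinstance(data, datetime.datetime):
        return data.isoformat()
    if isinstance(data, dict):
        return {k: make_serializable(v) for k, v in data.items()}
    if isinstance(data, list):
        return [make_serializable(i) for i in data]
    return data

meta = {
    'asset_name': 'crate_01',
    'temp_files': [Path('tmp') / 'a.png'],
    'start': datetime.datetime(2025, 5, 9, 12, 0, 0),
}
serializable = make_serializable(meta)
as_json = json.dumps(serializable)  # json.dumps on the raw dict would raise TypeError
```

The copy-based recursion leaves `context.asset_metadata` untouched, which matters because later stages (e.g. output organization) still read Path-typed entries from it.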
163
processing/pipeline/stages/metadata_initialization.py
Normal file
@ -0,0 +1,163 @@
import datetime
import logging

from .base_stage import ProcessingStage
from ..asset_context import AssetProcessingContext

# Note: base_stage.py lives in processing/pipeline/stages/ and asset_context.py in
# processing/pipeline/, so the relative imports above (.base_stage, ..asset_context)
# follow the actual file layout rather than the paths given in the original plan
# (which assumed base_stage.py in processing/pipeline/ and asset_context.py in processing/).


logger = logging.getLogger(__name__)


class MetadataInitializationStage(ProcessingStage):
    """
    Initializes metadata structures within the AssetProcessingContext.
    This stage sets up asset_metadata, processed_maps_details, and
    merged_maps_details.
    """

    def execute(self, context: AssetProcessingContext) -> AssetProcessingContext:
        """
        Executes the metadata initialization logic.

        Args:
            context: The AssetProcessingContext for the current asset.

        Returns:
            The modified AssetProcessingContext.
        """
        if context.status_flags.get('skip_asset', False):
            logger.debug(f"Asset '{context.asset_rule.name if context.asset_rule else 'Unknown'}': Skipping metadata initialization as 'skip_asset' is True.")
            return context

        logger.debug(f"Asset '{context.asset_rule.name}': Initializing metadata.")

        context.asset_metadata = {}
        context.processed_maps_details = {}
        context.merged_maps_details = {}

        # Populate initial asset_metadata
        if context.asset_rule:
            context.asset_metadata['asset_name'] = context.asset_rule.name
            context.asset_metadata['asset_id'] = str(context.asset_rule.id)
            context.asset_metadata['source_path'] = str(context.asset_rule.source_path)
            context.asset_metadata['output_path_pattern'] = context.asset_rule.output_path_pattern
            context.asset_metadata['tags'] = list(context.asset_rule.tags) if context.asset_rule.tags else []
            context.asset_metadata['custom_fields'] = dict(context.asset_rule.custom_fields) if context.asset_rule.custom_fields else {}
        else:
            # asset_rule should normally be set; fall back to placeholders if not
            logger.warning("AssetRule is not set in context during metadata initialization.")
            context.asset_metadata['asset_name'] = "Unknown Asset"
            context.asset_metadata['asset_id'] = "N/A"
            context.asset_metadata['source_path'] = "N/A"
            context.asset_metadata['output_path_pattern'] = "N/A"
            context.asset_metadata['tags'] = []
            context.asset_metadata['custom_fields'] = {}

        if context.source_rule:
            context.asset_metadata['source_rule_name'] = context.source_rule.name
            context.asset_metadata['source_rule_id'] = str(context.source_rule.id)
        else:
            logger.warning("SourceRule is not set in context during metadata initialization.")
            context.asset_metadata['source_rule_name'] = "Unknown Source Rule"
            context.asset_metadata['source_rule_id'] = "N/A"

        context.asset_metadata['effective_supplier'] = context.effective_supplier
        context.asset_metadata['processing_start_time'] = datetime.datetime.now().isoformat()
        context.asset_metadata['status'] = "Pending"

        if context.config_obj and hasattr(context.config_obj, 'general_settings') and \
                hasattr(context.config_obj.general_settings, 'app_version'):
            context.asset_metadata['version'] = context.config_obj.general_settings.app_version
        else:
            logger.warning("App version not found in config_obj.general_settings. Setting version to 'N/A'.")
            context.asset_metadata['version'] = "N/A"  # Default or placeholder

        if context.incrementing_value is not None:
            context.asset_metadata['incrementing_value'] = context.incrementing_value

        # The plan mentions sha5_value, which is likely a typo for sha256 or similar.
        # Implemented as 'sha5_value' per the plan, with sha256_value as a fallback.
        if hasattr(context, 'sha5_value') and context.sha5_value is not None:
            context.asset_metadata['sha5_value'] = context.sha5_value
        elif hasattr(context, 'sha256_value') and context.sha256_value is not None:
            logger.debug("sha5_value not found, using sha256_value if available for metadata.")
            context.asset_metadata['sha256_value'] = context.sha256_value

        logger.info(f"Asset '{context.asset_metadata.get('asset_name', 'Unknown')}': Metadata initialized.")
        # For debugging, the full metadata can be logged:
        # logger.debug(f"Initialized metadata: {context.asset_metadata}")

        return context
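Stages like the one above compose by threading the context through `execute`. The following is a minimal sketch of the orchestrator loop described in the plan's architecture overview (the class names and the dict-as-context here are simplifications, not the real `PipelineOrchestrator` or `AssetProcessingContext`):

```python
class Stage:
    """Contract every processing stage implements: take a context, return it."""
    def execute(self, context):
        raise NotImplementedError

class AddStartTime(Stage):
    def execute(self, context):
        context['start'] = 'now'  # a real stage would record an ISO timestamp
        return context

class MarkProcessed(Stage):
    def execute(self, context):
        context['status'] = 'Processed'
        return context

def run_pipeline(stages, context):
    # Mirrors the orchestrator idea: each stage receives the context produced
    # by the previous stage, so order defines the pipeline.
    for stage in stages:
        context = stage.execute(context)
    return context

result = run_pipeline([AddStartTime(), MarkProcessed()], {})
```

Because every stage returns the (possibly replaced) context, a stage can short-circuit later work simply by setting a flag, which is exactly how `skip_asset` propagates through the real stages.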
154
processing/pipeline/stages/normal_map_green_channel.py
Normal file
@ -0,0 +1,154 @@
import logging
import numpy as np
from pathlib import Path
from typing import List

from .base_stage import ProcessingStage
from ..asset_context import AssetProcessingContext
from .....rule_structure import FileRule
from ...utils import image_processing_utils as ipu
from .....utils.path_utils import sanitize_filename

logger = logging.getLogger(__name__)


class NormalMapGreenChannelStage(ProcessingStage):
    """
    Processing stage to invert the green channel of normal maps if configured.
    This is often needed when converting between DirectX (Y-) and OpenGL (Y+) normal map formats.
    """

    def execute(self, context: AssetProcessingContext) -> AssetProcessingContext:
        """
        Identifies NORMAL maps, checks configuration for green channel inversion,
        performs inversion if needed, saves a new temporary file, and updates
        the AssetProcessingContext.
        """
        if context.status_flags.get('skip_asset'):
            logger.debug(f"Asset '{context.asset_rule.name}': Skipping NormalMapGreenChannelStage due to skip_asset flag.")
            return context

        if not context.files_to_process or not context.processed_maps_details:
            logger.debug(
                f"Asset '{context.asset_rule.name}': No files to process or processed_maps_details empty in NormalMapGreenChannelStage. Skipping."
            )
            return context

        new_files_to_process: List[FileRule] = []
        processed_a_normal_map = False

        for file_rule in context.files_to_process:
            if file_rule.map_type == "NORMAL":
                # Check configuration for inversion. A global setting is assumed for now;
                # this key should exist in the Configuration object's general_settings.
                should_invert = context.config_obj.general_settings.get('invert_normal_map_green_channel_globally', False)

                if not should_invert:
                    logger.debug(
                        f"Asset '{context.asset_rule.name}': Normal map green channel inversion not enabled globally. "
                        f"Skipping for {file_rule.filename_pattern} (ID: {file_rule.id.hex})."
                    )
                    new_files_to_process.append(file_rule)
                    continue

                # Get the temporary processed file path
                map_details = context.processed_maps_details.get(file_rule.id.hex)
                if not map_details or map_details.get('status') != 'Processed' or not map_details.get('temp_processed_file'):
                    logger.warning(
                        f"Asset '{context.asset_rule.name}': Normal map {file_rule.filename_pattern} (ID: {file_rule.id.hex}) "
                        f"not found in processed_maps_details or not marked as 'Processed'. Cannot invert green channel."
                    )
                    new_files_to_process.append(file_rule)
                    continue

                original_temp_path = Path(map_details['temp_processed_file'])
                if not original_temp_path.exists():
                    logger.error(
                        f"Asset '{context.asset_rule.name}': Temporary file {original_temp_path} for normal map "
                        f"{file_rule.filename_pattern} (ID: {file_rule.id.hex}) does not exist. Cannot invert green channel."
                    )
                    new_files_to_process.append(file_rule)
                    continue

                image_data = ipu.load_image(original_temp_path)

                if image_data is None:
                    logger.error(
                        f"Asset '{context.asset_rule.name}': Failed to load image from {original_temp_path} "
                        f"for normal map {file_rule.filename_pattern} (ID: {file_rule.id.hex})."
                    )
                    new_files_to_process.append(file_rule)
                    continue

                if image_data.ndim != 3 or image_data.shape[2] < 2:  # Must have at least R and G channels
                    logger.error(
                        f"Asset '{context.asset_rule.name}': Image {original_temp_path} for normal map "
                        f"{file_rule.filename_pattern} (ID: {file_rule.id.hex}) is not a valid RGB/normal map "
                        f"(ndim={image_data.ndim}, channels={image_data.shape[2] if image_data.ndim == 3 else 'N/A'}) "
                        f"for green channel inversion."
                    )
                    new_files_to_process.append(file_rule)
                    continue

                # Perform green channel inversion
                modified_image_data = image_data.copy()
                try:
                    if np.issubdtype(modified_image_data.dtype, np.floating):
                        modified_image_data[:, :, 1] = 1.0 - modified_image_data[:, :, 1]
                    elif np.issubdtype(modified_image_data.dtype, np.integer):
                        max_val = np.iinfo(modified_image_data.dtype).max
                        modified_image_data[:, :, 1] = max_val - modified_image_data[:, :, 1]
                    else:
                        logger.error(
                            f"Asset '{context.asset_rule.name}': Unsupported image data type "
                            f"{modified_image_data.dtype} for normal map {original_temp_path}. Cannot invert green channel."
                        )
                        new_files_to_process.append(file_rule)
                        continue
                except IndexError:
                    logger.error(
                        f"Asset '{context.asset_rule.name}': Image {original_temp_path} for normal map "
                        f"{file_rule.filename_pattern} (ID: {file_rule.id.hex}) does not have a green channel (index 1) "
                        f"or has unexpected dimensions ({modified_image_data.shape}). Cannot invert."
                    )
                    new_files_to_process.append(file_rule)
                    continue

                # Save new temporary (modified normal) map
                new_temp_filename = f"normal_g_inv_{sanitize_filename(file_rule.map_type)}_{file_rule.id.hex}{original_temp_path.suffix}"
                new_temp_path = context.engine_temp_dir / new_temp_filename

                save_success = ipu.save_image(new_temp_path, modified_image_data)

                if save_success:
                    logger.info(
                        f"Asset '{context.asset_rule.name}': Inverted green channel for NORMAL map "
                        f"{original_temp_path.name}, saved to {new_temp_path.name}."
                    )
                    # Update processed_maps_details
                    context.processed_maps_details[file_rule.id.hex]['temp_processed_file'] = str(new_temp_path)
                    current_notes = context.processed_maps_details[file_rule.id.hex].get('notes', '')
                    context.processed_maps_details[file_rule.id.hex]['notes'] = \
                        f"{current_notes}; Green channel inverted by NormalMapGreenChannelStage".strip('; ')

                    new_files_to_process.append(file_rule)  # Add the original rule; it now points to the modified data
                    processed_a_normal_map = True
                else:
                    logger.error(
                        f"Asset '{context.asset_rule.name}': Failed to save inverted normal map to {new_temp_path} "
                        f"for original {original_temp_path.name}."
                    )
                    new_files_to_process.append(file_rule)  # Add the original rule, as processing failed
            else:
                # Not a normal map; pass it through unchanged
                new_files_to_process.append(file_rule)

        context.files_to_process = new_files_to_process

        if processed_a_normal_map:
            logger.info(f"Asset '{context.asset_rule.name}': NormalMapGreenChannelStage processed relevant normal maps.")
        else:
            logger.debug(f"Asset '{context.asset_rule.name}': No normal maps found or processed in NormalMapGreenChannelStage.")

        return context
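The dtype-dependent inversion above generalizes beyond uint8. A small standalone sketch, NumPy only, with invented array values, of the same branch logic:

```python
import numpy as np

def invert_green(image):
    """Return a copy with channel 1 inverted, handling float and integer dtypes."""
    out = image.copy()
    if np.issubdtype(out.dtype, np.floating):
        out[:, :, 1] = 1.0 - out[:, :, 1]        # float maps assumed in 0.0..1.0
    elif np.issubdtype(out.dtype, np.integer):
        max_val = np.iinfo(out.dtype).max        # 255 for uint8, 65535 for uint16
        out[:, :, 1] = max_val - out[:, :, 1]
    else:
        raise TypeError(f"Unsupported dtype {out.dtype}")
    return out

n8 = np.zeros((2, 2, 3), dtype=np.uint8)
n8[:, :, 1] = 100        # DirectX-style G value
nf = np.zeros((2, 2, 3), dtype=np.float32)
nf[:, :, 1] = 0.25
```

Inverting relative to the dtype's maximum (or 1.0 for floats) keeps the remapping symmetric around the neutral 0.5 value, which is what flipping a normal map's Y axis between DirectX and OpenGL conventions requires.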
155
processing/pipeline/stages/output_organization.py
Normal file
@ -0,0 +1,155 @@
import logging
import shutil
from pathlib import Path
from typing import List, Dict, Optional

from .base_stage import ProcessingStage
from ..asset_context import AssetProcessingContext
from ....utils.path_utils import generate_path_from_pattern, sanitize_filename
from ....config import FileRule, MergeRule  # Needed for type hints even though not read directly from context


logger = logging.getLogger(__name__)


class OutputOrganizationStage(ProcessingStage):
    """
    Organizes output files by copying temporary processed files to their final destinations.
    """

    def execute(self, context: AssetProcessingContext) -> AssetProcessingContext:
        """
        Copies temporary processed and merged files to their final output locations
        based on path patterns and updates the AssetProcessingContext.
        """
        logger.debug(f"Asset '{context.asset_rule.name}': Starting output organization stage.")

        if context.status_flags.get('skip_asset'):
            logger.info(f"Asset '{context.asset_rule.name}': Output organization skipped as 'skip_asset' is True.")
            return context

        current_status = context.asset_metadata.get('status', '')
        if current_status.startswith("Failed") or current_status == "Skipped":
            logger.info(f"Asset '{context.asset_rule.name}': Output organization skipped due to prior status: {current_status}.")
            return context

        final_output_files: List[str] = []
        # Ensure config_obj and general_settings are present; default overwrite_existing to False if not
        overwrite_existing = False
        if context.config_obj and hasattr(context.config_obj, 'general_settings'):
            overwrite_existing = context.config_obj.general_settings.overwrite_existing
        else:
            logger.warning(f"Asset '{context.asset_rule.name}': config_obj.general_settings not found, defaulting overwrite_existing to False.")

        # A. Organize Processed Individual Maps
        if context.processed_maps_details:
            logger.debug(f"Asset '{context.asset_rule.name}': Organizing {len(context.processed_maps_details)} processed individual map(s).")
            for file_rule_id, details in context.processed_maps_details.items():
                if details.get('status') != 'Processed' or not details.get('temp_processed_file'):
                    logger.debug(f"Asset '{context.asset_rule.name}': Skipping file_rule_id '{file_rule_id}' due to status '{details.get('status')}' or missing temp file.")
                    continue

                temp_file_path = Path(details['temp_processed_file'])
                map_type = details['map_type']

                output_filename = f"{context.asset_rule.name}_{sanitize_filename(map_type)}{temp_file_path.suffix}"
                if context.asset_rule and context.asset_rule.file_rules:
                    # processed_maps_details is keyed by the rule id's hex string
                    current_file_rule: Optional[FileRule] = next(
                        (fr for fr in context.asset_rule.file_rules if fr.id.hex == file_rule_id), None
                    )
                    if current_file_rule and current_file_rule.output_filename_pattern:
                        output_filename = current_file_rule.output_filename_pattern

                try:
                    final_path_str = generate_path_from_pattern(
                        base_path=str(context.output_base_path),
                        pattern=context.asset_rule.output_path_pattern,
                        asset_name=context.asset_rule.name,
                        map_type=map_type,
                        filename=output_filename,
                        source_rule_name=context.source_rule.name if context.source_rule else "DefaultSource",
                        incrementing_value=str(context.incrementing_value) if context.incrementing_value is not None else None,
                        sha_value=context.sha5_value  # keyword aligned with generate_path_from_pattern's sha_value parameter
                    )
                    final_path = Path(final_path_str)
                    final_path.parent.mkdir(parents=True, exist_ok=True)

                    if final_path.exists() and not overwrite_existing:
                        logger.info(f"Asset '{context.asset_rule.name}': Output file {final_path} exists and overwrite is disabled. Skipping copy.")
                    else:
                        shutil.copy2(temp_file_path, final_path)
                        logger.info(f"Asset '{context.asset_rule.name}': Copied {temp_file_path} to {final_path}")
                    final_output_files.append(str(final_path))

                    context.processed_maps_details[file_rule_id]['final_output_path'] = str(final_path)
                    context.processed_maps_details[file_rule_id]['status'] = 'Organized'  # Marks organization as complete

                except Exception as e:
                    logger.error(f"Asset '{context.asset_rule.name}': Failed to copy {temp_file_path} to {final_path_str if 'final_path_str' in locals() else 'unknown destination'} for file_rule_id '{file_rule_id}'. Error: {e}", exc_info=True)
                    context.status_flags['output_organization_error'] = True
                    context.asset_metadata['status'] = "Failed (Output Organization Error)"
                    # Also record the failure in the per-map details
                    context.processed_maps_details[file_rule_id]['status'] = 'Organization Failed'
        else:
            logger.debug(f"Asset '{context.asset_rule.name}': No processed individual maps to organize.")

        # B. Organize Merged Maps
        if context.merged_maps_details:
            logger.debug(f"Asset '{context.asset_rule.name}': Organizing {len(context.merged_maps_details)} merged map(s).")
            for merge_rule_id, details in context.merged_maps_details.items():
                if details.get('status') != 'Processed' or not details.get('temp_merged_file'):
                    logger.debug(f"Asset '{context.asset_rule.name}': Skipping merge_rule_id '{merge_rule_id}' due to status '{details.get('status')}' or missing temp file.")
                    continue

                temp_file_path = Path(details['temp_merged_file'])
                map_type = details['map_type']  # This is the output_map_type of the merge rule

                output_filename = f"{context.asset_rule.name}_{sanitize_filename(map_type)}{temp_file_path.suffix}"
                if context.asset_rule and context.asset_rule.merge_rules:
                    current_merge_rule: Optional[MergeRule] = next(
|
||||
(mr for mr in context.asset_rule.merge_rules if mr.id == merge_rule_id), None
|
||||
)
|
||||
if current_merge_rule and current_merge_rule.output_filename_pattern:
|
||||
output_filename = current_merge_rule.output_filename_pattern
|
||||
|
||||
try:
|
||||
final_path_str = generate_path_from_pattern(
|
||||
base_path=str(context.output_base_path),
|
||||
pattern=context.asset_rule.output_path_pattern,
|
||||
asset_name=context.asset_rule.name,
|
||||
map_type=map_type,
|
||||
filename=output_filename,
|
||||
source_rule_name=context.source_rule.name if context.source_rule else "DefaultSource",
|
||||
incrementing_value=str(context.incrementing_value) if context.incrementing_value is not None else None,
|
||||
sha5_value=context.sha5_value
|
||||
)
|
||||
final_path = Path(final_path_str)
|
||||
final_path.parent.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
if final_path.exists() and not overwrite_existing:
|
||||
logger.info(f"Asset '{context.asset_rule.name}': Output file {final_path} exists and overwrite is disabled. Skipping copy for merged map.")
|
||||
else:
|
||||
shutil.copy2(temp_file_path, final_path)
|
||||
logger.info(f"Asset '{context.asset_rule.name}': Copied merged map {temp_file_path} to {final_path}")
|
||||
final_output_files.append(str(final_path))
|
||||
|
||||
context.merged_maps_details[merge_rule_id]['final_output_path'] = str(final_path)
|
||||
context.merged_maps_details[merge_rule_id]['status'] = 'Organized'
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Asset '{context.asset_rule.name}': Failed to copy merged map {temp_file_path} to {final_path_str if 'final_path_str' in locals() else 'unknown destination'} for merge_rule_id '{merge_rule_id}'. Error: {e}", exc_info=True)
|
||||
context.status_flags['output_organization_error'] = True
|
||||
context.asset_metadata['status'] = "Failed (Output Organization Error)"
|
||||
context.merged_maps_details[merge_rule_id]['status'] = 'Organization Failed'
|
||||
else:
|
||||
logger.debug(f"Asset '{context.asset_rule.name}': No merged maps to organize.")
|
||||
|
||||
context.asset_metadata['final_output_files'] = final_output_files
|
||||
|
||||
if context.status_flags.get('output_organization_error'):
|
||||
logger.error(f"Asset '{context.asset_rule.name}': Output organization encountered errors. Status: {context.asset_metadata['status']}")
|
||||
else:
|
||||
logger.info(f"Asset '{context.asset_rule.name}': Output organization complete. {len(final_output_files)} files placed.")
|
||||
|
||||
logger.debug(f"Asset '{context.asset_rule.name}': Output organization stage finished.")
|
||||
return context
|
||||
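Both branches above route output paths through `generate_path_from_pattern`. A minimal, self-contained sketch of that style of token substitution (the `expand_pattern` name and the `{token}` placeholder syntax here are illustrative assumptions, not the actual API in `utils/path_utils.py`):

```python
from pathlib import Path

def expand_pattern(base_path: str, pattern: str, **tokens) -> Path:
    # Replace each {token} placeholder in the pattern with its value,
    # then join the expanded relative path onto the base output directory.
    out = pattern
    for key, value in tokens.items():
        out = out.replace("{" + key + "}", str(value))
    return Path(base_path) / out

result = expand_pattern(
    "/out", "{asset_name}/{map_type}/{filename}",
    asset_name="Rock01", map_type="ALBEDO", filename="Rock01_ALBEDO.png",
)
```

The real utility also handles the incrementing and hash tokens seen in the calls above.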
61  processing/pipeline/stages/supplier_determination.py  Normal file
@@ -0,0 +1,61 @@
import logging

from .base_stage import ProcessingStage
from ..asset_context import AssetProcessingContext


class SupplierDeterminationStage(ProcessingStage):
    """
    Determines the effective supplier for an asset based on asset and source rules.
    """

    def execute(self, context: AssetProcessingContext) -> AssetProcessingContext:
        """
        Determines and validates the effective supplier for the asset.

        Args:
            context: The asset processing context.

        Returns:
            The updated asset processing context.
        """
        effective_supplier = None
        logger = logging.getLogger(__name__)  # Logger specific to this module

        # 1. Check asset_rule.supplier_override
        if context.asset_rule and context.asset_rule.supplier_override:
            effective_supplier = context.asset_rule.supplier_override
            logger.debug(f"Asset '{context.asset_rule.name}': Supplier override found: '{effective_supplier}'.")

        # 2. If not overridden, fall back to source_rule.supplier
        if not effective_supplier and context.source_rule and context.source_rule.supplier:
            effective_supplier = context.source_rule.supplier
            logger.debug(f"Asset '{context.asset_rule.name if context.asset_rule else 'Unknown'}': Source rule supplier found: '{effective_supplier}'.")

        # 3. Validation
        if not effective_supplier:
            asset_name = context.asset_rule.name if context.asset_rule else "Unknown Asset"
            logger.error(f"Asset '{asset_name}': No supplier defined in asset rule or source rule.")
            context.effective_supplier = None
            if context.status_flags is None:  # Ensure status_flags exists
                context.status_flags = {}
            context.status_flags['supplier_error'] = True
        elif context.config_obj and effective_supplier not in context.config_obj.suppliers:
            asset_name = context.asset_rule.name if context.asset_rule else "Unknown Asset"
            logger.warning(
                f"Asset '{asset_name}': Supplier '{effective_supplier}' not found in global supplier configuration. "
                f"Available: {list(context.config_obj.suppliers.keys()) if context.config_obj.suppliers else 'None'}"
            )
            context.effective_supplier = None
            if context.status_flags is None:  # Ensure status_flags exists
                context.status_flags = {}
            context.status_flags['supplier_error'] = True
        else:
            context.effective_supplier = effective_supplier
            asset_name = context.asset_rule.name if context.asset_rule else "Unknown Asset"
            logger.info(f"Asset '{asset_name}': Effective supplier set to '{effective_supplier}'.")
            # Optionally clear the error flag if previously set and now resolved,
            # though the current logic does not exercise this path:
            # if context.status_flags and 'supplier_error' in context.status_flags:
            #     del context.status_flags['supplier_error']

        return context
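The stage's precedence rule (an asset-level `supplier_override` wins; otherwise the source rule's supplier applies) can be sketched in isolation:

```python
def resolve_supplier(asset_override, source_supplier):
    # Asset-level override takes precedence; fall back to the source rule's supplier.
    # Returns None when neither is set, which the stage treats as an error.
    return asset_override or source_supplier
```

Validation against the global supplier configuration then happens as a separate step, as in the stage above.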
1  processing/utils/__init__.py  Normal file
@@ -0,0 +1 @@
# This file makes the 'utils' directory a Python package.
357  processing/utils/image_processing_utils.py  Normal file
@@ -0,0 +1,357 @@
import cv2
import numpy as np
from pathlib import Path
import math
from typing import Optional, Union, List, Tuple, Dict

# --- Basic Power-of-Two Utilities ---

def is_power_of_two(n: int) -> bool:
    """Checks if a number is a power of two."""
    return (n > 0) and (n & (n - 1) == 0)

def get_nearest_pot(value: int) -> int:
    """Finds the nearest power of two to the given value."""
    if value <= 0:
        return 1  # A POT must be positive; return 1 as a fallback
    if is_power_of_two(value):
        return value

    lower_pot = 1 << (value.bit_length() - 1)
    upper_pot = 1 << value.bit_length()

    if (value - lower_pot) < (upper_pot - value):
        return lower_pot
    else:
        return upper_pot
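A standalone restatement of the two POT helpers, useful as a quick sanity check of the rounding behavior (note that a value exactly midway between two powers rounds up):

```python
def is_power_of_two(n: int) -> bool:
    # A positive power of two has exactly one bit set.
    return (n > 0) and (n & (n - 1) == 0)

def get_nearest_pot(value: int) -> int:
    # Snap to the nearer of the two surrounding powers of two; ties go up.
    if value <= 0:
        return 1
    if is_power_of_two(value):
        return value
    lower = 1 << (value.bit_length() - 1)
    upper = 1 << value.bit_length()
    return lower if (value - lower) < (upper - value) else upper
```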
# --- Dimension Calculation ---

def calculate_target_dimensions(
    original_width: int,
    original_height: int,
    target_width: Optional[int] = None,
    target_height: Optional[int] = None,
    resize_mode: str = "fit",  # e.g., "fit", "stretch", "max_dim_pot"
    ensure_pot: bool = False,
    allow_upscale: bool = False,
    target_max_dim_for_pot_mode: Optional[int] = None  # Specific to "max_dim_pot"
) -> Tuple[int, int]:
    """
    Calculates target dimensions based on various modes and constraints.

    Args:
        original_width: Original width of the image.
        original_height: Original height of the image.
        target_width: Desired target width.
        target_height: Desired target height.
        resize_mode:
            - "fit": Scales to fit within target_width/target_height, maintaining aspect ratio.
              Requires at least one of target_width or target_height.
            - "stretch": Scales to exactly target_width and target_height, ignoring aspect ratio.
              Requires both target_width and target_height.
            - "max_dim_pot": Scales to fit target_max_dim_for_pot_mode while maintaining aspect
              ratio, then snaps each dimension to the nearest POT.
              Requires target_max_dim_for_pot_mode.
        ensure_pot: If True, final dimensions are adjusted to the nearest power of two.
        allow_upscale: If False, dimensions will not exceed the originals unless ensure_pot forces it.
        target_max_dim_for_pot_mode: Max dimension to use when resize_mode is "max_dim_pot".

    Returns:
        A tuple (new_width, new_height).
    """
    if original_width <= 0 or original_height <= 0:
        # Fallback for invalid original dimensions
        fallback_dim = 1
        if ensure_pot:
            if target_width and target_height:
                fallback_dim = get_nearest_pot(max(target_width, target_height, 1))
            elif target_width:
                fallback_dim = get_nearest_pot(target_width)
            elif target_height:
                fallback_dim = get_nearest_pot(target_height)
            elif target_max_dim_for_pot_mode:
                fallback_dim = get_nearest_pot(target_max_dim_for_pot_mode)
            else:  # Default POT if no target given
                fallback_dim = 256
            return (fallback_dim, fallback_dim)
        return (target_width or 1, target_height or 1)

    w, h = original_width, original_height

    if resize_mode == "max_dim_pot":
        if target_max_dim_for_pot_mode is None:
            raise ValueError("target_max_dim_for_pot_mode must be provided for 'max_dim_pot' resize_mode.")

        # Logic adapted from the old processing_engine.calculate_target_dimensions
        ratio = w / h
        if ratio > 1:  # Width is dominant
            scaled_w = target_max_dim_for_pot_mode
            scaled_h = max(1, round(scaled_w / ratio))
        else:  # Height is dominant or square
            scaled_h = target_max_dim_for_pot_mode
            scaled_w = max(1, round(scaled_h * ratio))

        # The upscale check for this mode is implicitly handled by target_max_dim.
        # ensure_pot is effectively always applied as the final step for this mode.
        w = get_nearest_pot(scaled_w)
        h = get_nearest_pot(scaled_h)
        return int(w), int(h)

    elif resize_mode == "fit":
        if target_width is None and target_height is None:
            raise ValueError("At least one of target_width or target_height must be provided for 'fit' mode.")

        if target_width and target_height:
            ratio_orig = w / h
            ratio_target = target_width / target_height
            if ratio_orig > ratio_target:  # Original is wider than target aspect
                w_new = target_width
                h_new = max(1, round(w_new / ratio_orig))
            else:  # Original is taller or same aspect
                h_new = target_height
                w_new = max(1, round(h_new * ratio_orig))
        elif target_width:
            w_new = target_width
            h_new = max(1, round(w_new / (w / h)))
        else:  # target_height is not None
            h_new = target_height
            w_new = max(1, round(h_new * (w / h)))
        w, h = w_new, h_new

    elif resize_mode == "stretch":
        if target_width is None or target_height is None:
            raise ValueError("Both target_width and target_height must be provided for 'stretch' mode.")
        w, h = target_width, target_height

    else:
        raise ValueError(f"Unsupported resize_mode: {resize_mode}")

    if not allow_upscale:
        if w > original_width: w = original_width
        if h > original_height: h = original_height

    if ensure_pot:
        w = get_nearest_pot(w)
        h = get_nearest_pot(h)
        # Re-check upscale if the POT adjustment exceeded the original and upscaling is not allowed
        if not allow_upscale:
            if w > original_width: w = get_nearest_pot(original_width)  # Closest POT to original
            if h > original_height: h = get_nearest_pot(original_height)

    return int(max(1, w)), int(max(1, h))
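The "fit" branch boils down to scaling by the limiting axis. A simplified sketch of that arithmetic (not the full function above, which also handles POT snapping and upscale limits):

```python
def fit_within(ow: int, oh: int, tw: int, th: int):
    # Scale by whichever axis hits its bound first, preserving aspect ratio.
    scale = min(tw / ow, th / oh)
    return max(1, round(ow * scale)), max(1, round(oh * scale))
```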
# --- Image Statistics ---

def calculate_image_stats(image_data: np.ndarray) -> Optional[Dict]:
    """
    Calculates min, max, and mean for a given NumPy image array.
    Handles grayscale and multi-channel images. Converts to float64 for calculation.
    Normalizes uint8/uint16 data to the 0-1 range before calculating stats.
    """
    if image_data is None:
        return None
    try:
        data_float = image_data.astype(np.float64)

        if image_data.dtype == np.uint16:
            data_float /= 65535.0
        elif image_data.dtype == np.uint8:
            data_float /= 255.0

        stats = {}
        if len(data_float.shape) == 2:  # Grayscale (H, W)
            stats["min"] = float(np.min(data_float))
            stats["max"] = float(np.max(data_float))
            stats["mean"] = float(np.mean(data_float))
        elif len(data_float.shape) == 3:  # Color (H, W, C)
            stats["min"] = [float(v) for v in np.min(data_float, axis=(0, 1))]
            stats["max"] = [float(v) for v in np.max(data_float, axis=(0, 1))]
            stats["mean"] = [float(v) for v in np.mean(data_float, axis=(0, 1))]
        else:
            return None  # Unsupported shape
        return stats
    except Exception:
        return {"error": "Error calculating image stats"}
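`calculate_image_stats` normalizes integer data before measuring, so an 8-bit and a 16-bit version of the same texture report comparable numbers. The same normalization shown standalone with NumPy:

```python
import numpy as np

# uint8 data is divided by 255 so stats land in the 0-1 range
img = np.array([[0, 128], [255, 255]], dtype=np.uint8)
data = img.astype(np.float64) / 255.0
stats = {
    "min": float(np.min(data)),
    "max": float(np.max(data)),
    "mean": float(np.mean(data)),
}
```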
# --- Aspect Ratio String ---

def normalize_aspect_ratio_change(original_width: int, original_height: int, resized_width: int, resized_height: int, decimals: int = 2) -> str:
    """
    Calculates the aspect ratio change string (e.g., "EVEN", "X133").
    """
    if original_width <= 0 or original_height <= 0:
        return "InvalidInput"
    if resized_width <= 0 or resized_height <= 0:
        return "InvalidResize"

    width_change_percentage = ((resized_width - original_width) / original_width) * 100
    height_change_percentage = ((resized_height - original_height) / original_height) * 100

    normalized_width_change = width_change_percentage / 100
    normalized_height_change = height_change_percentage / 100

    normalized_width_change = min(max(normalized_width_change + 1, 0), 2)
    normalized_height_change = min(max(normalized_height_change + 1, 0), 2)

    epsilon = 1e-9
    if abs(normalized_width_change) < epsilon and abs(normalized_height_change) < epsilon:
        closest_value_to_one = 1.0
    elif abs(normalized_width_change) < epsilon:
        closest_value_to_one = abs(normalized_height_change)
    elif abs(normalized_height_change) < epsilon:
        closest_value_to_one = abs(normalized_width_change)
    else:
        closest_value_to_one = min(abs(normalized_width_change), abs(normalized_height_change))

    scale_factor = 1 / (closest_value_to_one + epsilon) if abs(closest_value_to_one) < epsilon else 1 / closest_value_to_one

    scaled_normalized_width_change = scale_factor * normalized_width_change
    scaled_normalized_height_change = scale_factor * normalized_height_change

    output_width = round(scaled_normalized_width_change, decimals)
    output_height = round(scaled_normalized_height_change, decimals)

    if abs(output_width - 1.0) < epsilon: output_width = 1
    if abs(output_height - 1.0) < epsilon: output_height = 1

    if abs(output_width - output_height) < epsilon:  # Original square or aspect maintained
        output = "EVEN"
    elif output_width != 1 and abs(output_height - 1.0) < epsilon:  # Width changed, height maintained relative to width
        output = f"X{str(output_width).replace('.', '')}"
    elif output_height != 1 and abs(output_width - 1.0) < epsilon:  # Height changed, width maintained relative to height
        output = f"Y{str(output_height).replace('.', '')}"
    else:  # Both changed relative to each other
        output = f"X{str(output_width).replace('.', '')}Y{str(output_height).replace('.', '')}"
    return output
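The "EVEN" case ultimately reduces to checking whether the resize preserved the aspect ratio exactly. A simplified predicate for just that case (the full function above also encodes the direction and magnitude of a non-uniform change):

```python
def aspect_preserved(ow: int, oh: int, rw: int, rh: int) -> bool:
    # Cross-multiplied ratio comparison avoids floating-point division entirely.
    return rw * oh == rh * ow
```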
# --- Image Loading, Conversion, Resizing ---

def load_image(image_path: Union[str, Path], read_flag: int = cv2.IMREAD_UNCHANGED) -> Optional[np.ndarray]:
    """Loads an image from the specified path."""
    try:
        img = cv2.imread(str(image_path), read_flag)
        if img is None:
            # print(f"Warning: Failed to load image: {image_path}")  # Optional: for debugging utils
            return None
        return img
    except Exception:  # as e:
        # print(f"Error loading image {image_path}: {e}")  # Optional: for debugging utils
        return None

def convert_bgr_to_rgb(image: np.ndarray) -> np.ndarray:
    """Converts an image from BGR to RGB color space."""
    if image is None or len(image.shape) < 3:
        return image  # Return as-is if not a color image or None

    if image.shape[2] == 4:  # BGRA
        return cv2.cvtColor(image, cv2.COLOR_BGRA2RGB)
    elif image.shape[2] == 3:  # BGR
        return cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
    return image  # Return as-is if not 3 or 4 channels

def convert_rgb_to_bgr(image: np.ndarray) -> np.ndarray:
    """Converts an image from RGB to BGR color space."""
    if image is None or len(image.shape) < 3 or image.shape[2] != 3:  # Only for 3-channel RGB
        return image  # Return as-is if not a 3-channel color image or None
    return cv2.cvtColor(image, cv2.COLOR_RGB2BGR)


def resize_image(image: np.ndarray, target_width: int, target_height: int, interpolation: Optional[int] = None) -> np.ndarray:
    """Resizes an image to target_width and target_height."""
    if image is None:
        raise ValueError("Cannot resize a None image.")
    if target_width <= 0 or target_height <= 0:
        raise ValueError("Target width and height must be positive.")

    original_height, original_width = image.shape[:2]

    if interpolation is None:
        # Default interpolation: Lanczos for downscaling, cubic for upscaling or same size
        if (target_width * target_height) < (original_width * original_height):
            interpolation = cv2.INTER_LANCZOS4
        else:
            interpolation = cv2.INTER_CUBIC

    return cv2.resize(image, (target_width, target_height), interpolation=interpolation)
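`resize_image` picks its default filter from the pixel-count ratio. The heuristic restated with string stand-ins for the `cv2` flags, so the sketch runs without OpenCV installed:

```python
def pick_interpolation(ow: int, oh: int, tw: int, th: int) -> str:
    # Lanczos preserves detail when shrinking; cubic is the usual choice otherwise.
    # The strings stand in for cv2.INTER_LANCZOS4 / cv2.INTER_CUBIC.
    return "INTER_LANCZOS4" if (tw * th) < (ow * oh) else "INTER_CUBIC"
```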
# --- Image Saving ---

def save_image(
    image_path: Union[str, Path],
    image_data: np.ndarray,
    output_format: Optional[str] = None,  # e.g. "png", "jpg", "exr"
    output_dtype_target: Optional[np.dtype] = None,  # e.g. np.uint8, np.uint16, np.float16
    params: Optional[List[int]] = None,
    convert_to_bgr_before_save: bool = True  # True for most formats except EXR
) -> bool:
    """
    Saves image data to a file. Handles data type and color space conversions.

    Args:
        image_path: Path to save the image.
        image_data: NumPy array of the image.
        output_format: Desired output format (e.g., 'png', 'jpg'). If None, derived from the extension.
        output_dtype_target: Target NumPy dtype for saving (e.g., np.uint8, np.uint16).
            If None, tries to use image_data.dtype or a sensible default.
        params: OpenCV imwrite parameters (e.g., [cv2.IMWRITE_JPEG_QUALITY, 90]).
        convert_to_bgr_before_save: If True and the image is 3-channel, converts RGB to BGR.
            Set to False for formats like EXR that expect RGB.

    Returns:
        True if saving was successful, False otherwise.
    """
    if image_data is None:
        return False

    img_to_save = image_data.copy()
    path_obj = Path(image_path)
    path_obj.parent.mkdir(parents=True, exist_ok=True)

    # 1. Data Type Conversion
    if output_dtype_target is not None:
        if output_dtype_target == np.uint8 and img_to_save.dtype != np.uint8:
            if img_to_save.dtype == np.uint16:
                img_to_save = (img_to_save.astype(np.float32) / 65535.0 * 255.0).astype(np.uint8)
            elif img_to_save.dtype in [np.float16, np.float32, np.float64]:
                img_to_save = (np.clip(img_to_save, 0.0, 1.0) * 255.0).astype(np.uint8)
            else:
                img_to_save = img_to_save.astype(np.uint8)
        elif output_dtype_target == np.uint16 and img_to_save.dtype != np.uint16:
            if img_to_save.dtype == np.uint8:
                img_to_save = (img_to_save.astype(np.float32) / 255.0 * 65535.0).astype(np.uint16)  # More accurate than bit-shifting
            elif img_to_save.dtype in [np.float16, np.float32, np.float64]:
                img_to_save = (np.clip(img_to_save, 0.0, 1.0) * 65535.0).astype(np.uint16)
            else:
                img_to_save = img_to_save.astype(np.uint16)
        elif output_dtype_target == np.float16 and img_to_save.dtype != np.float16:
            if img_to_save.dtype == np.uint16:
                img_to_save = (img_to_save.astype(np.float32) / 65535.0).astype(np.float16)
            elif img_to_save.dtype == np.uint8:
                img_to_save = (img_to_save.astype(np.float32) / 255.0).astype(np.float16)
            elif img_to_save.dtype in [np.float32, np.float64]:
                img_to_save = img_to_save.astype(np.float16)
            # else: cannot convert to float16 easily
        elif output_dtype_target == np.float32 and img_to_save.dtype != np.float32:
            if img_to_save.dtype == np.uint16:
                img_to_save = img_to_save.astype(np.float32) / 65535.0
            elif img_to_save.dtype == np.uint8:
                img_to_save = img_to_save.astype(np.float32) / 255.0
            elif img_to_save.dtype == np.float16:
                img_to_save = img_to_save.astype(np.float32)

    # 2. Color Space Conversion (RGB -> BGR)
    # OpenCV expects BGR for formats like PNG and JPG; EXR usually expects RGB.
    # The `convert_to_bgr_before_save` flag controls this; for EXR output it should generally be False.
    current_format = output_format if output_format else path_obj.suffix.lower().lstrip('.')

    if convert_to_bgr_before_save and current_format != 'exr':
        if len(img_to_save.shape) == 3 and img_to_save.shape[2] == 3:
            img_to_save = convert_rgb_to_bgr(img_to_save)
        # BGRA is handled by OpenCV imwrite for PNGs; no explicit conversion is needed when saving as RGBA.
        # A 4-channel image going to a format without alpha may need stripping or specific handling.
        # For simplicity, this function assumes 3-channel RGB input when BGR conversion is active.

    # 3. Save Image
    try:
        if params:
            cv2.imwrite(str(path_obj), img_to_save, params)
        else:
            cv2.imwrite(str(path_obj), img_to_save)
        return True
    except Exception:  # as e:
        # print(f"Error saving image {path_obj}: {e}")  # Optional: for debugging utils
        return False
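`save_image`'s uint8-to-uint16 widening goes through float so the value range is rescaled rather than merely recast. The same conversion standalone:

```python
import numpy as np

# Normalize to 0-1 in float, rescale to the 16-bit range, then cast.
pixels = np.array([0, 255], dtype=np.uint8)
widened = (pixels.astype(np.float32) / 255.0 * 65535.0).astype(np.uint16)
```

A plain `astype(np.uint16)` would keep 255 as 255, leaving the image nearly black in 16-bit terms.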
1589  processing_engine.py
File diff suppressed because it is too large.
1  tests/__init__.py  Normal file
@@ -0,0 +1 @@
# This file makes the 'tests' directory a Python package.
1  tests/processing/pipeline/__init__.py  Normal file
@@ -0,0 +1 @@
# This file makes Python treat the directory as a package.
1  tests/processing/pipeline/stages/__init__.py  Normal file
@@ -0,0 +1 @@
# This file makes Python treat the directory as a package.
@@ -0,0 +1,273 @@
import uuid
from pathlib import Path
from typing import Optional
from unittest import mock

import numpy as np
import pytest

from processing.pipeline.stages.alpha_extraction_to_mask import AlphaExtractionToMaskStage
from processing.pipeline.asset_context import AssetProcessingContext
from rule_structure import AssetRule, SourceRule, FileRule, TransformSettings
from configuration import Configuration, GeneralSettings
import processing.utils.image_processing_utils as ipu  # Imported so the module path exists for mocking

# Helper Functions
def create_mock_file_rule_for_alpha_test(
    id_val: Optional[uuid.UUID] = None,
    map_type: str = "ALBEDO",
    filename_pattern: str = "albedo.png",
    item_type: str = "MAP_COL",
    active: bool = True
) -> mock.MagicMock:
    mock_fr = mock.MagicMock(spec=FileRule)
    mock_fr.id = id_val if id_val else uuid.uuid4()
    mock_fr.map_type = map_type
    mock_fr.filename_pattern = filename_pattern
    mock_fr.item_type = item_type
    mock_fr.active = active
    mock_fr.transform_settings = mock.MagicMock(spec=TransformSettings)
    return mock_fr
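These helpers lean on `mock.MagicMock(spec=...)`. A small demonstration of what `spec` buys the tests: attributes defined on the spec class can be set and read freely, while accessing anything outside the spec raises `AttributeError` (the `FileRule` class below is a hypothetical stand-in, not the real `rule_structure.FileRule`):

```python
from unittest import mock

class FileRule:
    # Hypothetical stand-in exposing just the attributes the example touches.
    map_type = "ALBEDO"
    active = True

fr = mock.MagicMock(spec=FileRule)
fr.map_type = "MASK"  # attributes on the spec can be reassigned

try:
    fr.no_such_field  # anything not on the spec class raises
    spec_enforced = False
except AttributeError:
    spec_enforced = True
```

This catches typos in attribute names that a bare `MagicMock` would silently absorb.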
def create_alpha_extraction_mock_context(
    initial_file_rules: list = None,
    initial_processed_details: dict = None,
    skip_asset_flag: bool = False,
    asset_name: str = "AlphaAsset",
    # extract_alpha_globally: bool = True  # If the stage checks a global flag
) -> AssetProcessingContext:
    mock_asset_rule = mock.MagicMock(spec=AssetRule)
    mock_asset_rule.name = asset_name

    mock_source_rule = mock.MagicMock(spec=SourceRule)

    mock_gs = mock.MagicMock(spec=GeneralSettings)
    # If the stage uses a global flag:
    # mock_gs.extract_alpha_to_mask_globally = extract_alpha_globally

    mock_config = mock.MagicMock(spec=Configuration)
    mock_config.general_settings = mock_gs

    context = AssetProcessingContext(
        source_rule=mock_source_rule,
        asset_rule=mock_asset_rule,
        workspace_path=Path("/fake/workspace"),
        engine_temp_dir=Path("/fake/temp_engine_dir"),
        output_base_path=Path("/fake/output"),
        effective_supplier="ValidSupplier",
        asset_metadata={'asset_name': asset_name},
        processed_maps_details=initial_processed_details if initial_processed_details is not None else {},
        merged_maps_details={},
        files_to_process=list(initial_file_rules) if initial_file_rules else [],
        loaded_data_cache={},
        config_obj=mock_config,
        status_flags={'skip_asset': skip_asset_flag},
        incrementing_value=None,
        sha5_value=None
    )
    return context
# Unit Tests
@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.save_image')
@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.load_image')
@mock.patch('logging.info')  # Mock logging to avoid console output during tests
def test_asset_skipped(mock_log_info, mock_load_image, mock_save_image):
    stage = AlphaExtractionToMaskStage()
    context = create_alpha_extraction_mock_context(skip_asset_flag=True)

    updated_context = stage.execute(context)

    assert updated_context == context  # Context should be unchanged
    mock_load_image.assert_not_called()
    mock_save_image.assert_not_called()
    assert len(updated_context.files_to_process) == 0
    assert not updated_context.processed_maps_details


@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.save_image')
@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.load_image')
@mock.patch('logging.info')
def test_existing_mask_map(mock_log_info, mock_load_image, mock_save_image):
    stage = AlphaExtractionToMaskStage()

    existing_mask_rule = create_mock_file_rule_for_alpha_test(map_type="MASK", filename_pattern="mask.png")
    context = create_alpha_extraction_mock_context(initial_file_rules=[existing_mask_rule])

    updated_context = stage.execute(context)

    assert updated_context == context
    mock_load_image.assert_not_called()
    mock_save_image.assert_not_called()
    assert len(updated_context.files_to_process) == 1
    assert updated_context.files_to_process[0].map_type == "MASK"
@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.save_image')
@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.load_image')
@mock.patch('logging.info')
def test_alpha_extraction_success(mock_log_info, mock_load_image, mock_save_image):
    stage = AlphaExtractionToMaskStage()

    albedo_rule_id = uuid.uuid4()
    albedo_fr = create_mock_file_rule_for_alpha_test(id_val=albedo_rule_id, map_type="ALBEDO")

    initial_processed_details = {
        albedo_fr.id.hex: {
            'temp_processed_file': '/fake/temp_engine_dir/processed_albedo.png',
            'status': 'Processed',
            'map_type': 'ALBEDO',
            'source_file_path': Path('/fake/source/albedo.png')
        }
    }
    context = create_alpha_extraction_mock_context(
        initial_file_rules=[albedo_fr],
        initial_processed_details=initial_processed_details
    )

    mock_rgba_data = np.zeros((10, 10, 4), dtype=np.uint8)
    mock_rgba_data[:, :, 3] = 128  # Example alpha data
    mock_load_image.side_effect = [mock_rgba_data, mock_rgba_data]

    mock_save_image.return_value = True

    updated_context = stage.execute(context)

    # First call checks for alpha, second loads the data for saving
    assert mock_load_image.call_count == 2
    mock_load_image.assert_any_call(Path('/fake/temp_engine_dir/processed_albedo.png'))

    mock_save_image.assert_called_once()
    saved_path_arg = mock_save_image.call_args[0][0]
    saved_data_arg = mock_save_image.call_args[0][1]

    assert isinstance(saved_path_arg, Path)
    assert "mask_from_alpha_" in saved_path_arg.name
    assert np.array_equal(saved_data_arg, mock_rgba_data[:, :, 3])

    assert len(updated_context.files_to_process) == 2
    new_mask_rule = None
    for fr in updated_context.files_to_process:
        if fr.map_type == "MASK":
            new_mask_rule = fr
            break
    assert new_mask_rule is not None
    assert new_mask_rule.item_type == "MAP_DER"  # Derived map

    assert new_mask_rule.id.hex in updated_context.processed_maps_details
    new_mask_detail = updated_context.processed_maps_details[new_mask_rule.id.hex]
    assert new_mask_detail['map_type'] == "MASK"
    assert "mask_from_alpha_" in new_mask_detail['temp_processed_file']
    assert "Generated from alpha of ALBEDO" in new_mask_detail['notes']  # Check for the specific note
    assert new_mask_detail['status'] == 'Processed'
@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.save_image')
|
||||
@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.load_image')
|
||||
@mock.patch('logging.info')
|
||||
def test_no_alpha_channel_in_source(mock_log_info, mock_load_image, mock_save_image):
|
||||
stage = AlphaExtractionToMaskStage()
|
||||
|
||||
albedo_rule_id = uuid.uuid4()
|
||||
albedo_fr = create_mock_file_rule_for_alpha_test(id_val=albedo_rule_id, map_type="ALBEDO")
|
||||
initial_processed_details = {
|
||||
albedo_fr.id.hex: {'temp_processed_file': '/fake/temp_engine_dir/processed_rgb_albedo.png', 'status': 'Processed', 'map_type': 'ALBEDO', 'source_file_path': Path('/fake/source/albedo_rgb.png')}
|
||||
}
|
||||
context = create_alpha_extraction_mock_context(
|
||||
initial_file_rules=[albedo_fr],
|
||||
initial_processed_details=initial_processed_details
|
||||
)
|
||||
|
||||
mock_rgb_data = np.zeros((10, 10, 3), dtype=np.uint8) # RGB, no alpha
|
||||
mock_load_image.return_value = mock_rgb_data # Only called once for check
|
||||
|
||||
updated_context = stage.execute(context)
|
||||
|
||||
mock_load_image.assert_called_once_with(Path('/fake/temp_engine_dir/processed_rgb_albedo.png'))
|
||||
mock_save_image.assert_not_called()
|
||||
assert len(updated_context.files_to_process) == 1 # No new MASK rule
|
||||
assert albedo_fr.id.hex in updated_context.processed_maps_details
|
||||
assert len(updated_context.processed_maps_details) == 1
|
||||
|
||||
|
||||
@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.save_image')
|
||||
@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.load_image')
|
||||
@mock.patch('logging.info')
|
||||
def test_no_suitable_source_map_type(mock_log_info, mock_load_image, mock_save_image):
|
||||
stage = AlphaExtractionToMaskStage()
|
||||
|
||||
normal_rule_id = uuid.uuid4()
|
||||
normal_fr = create_mock_file_rule_for_alpha_test(id_val=normal_rule_id, map_type="NORMAL")
|
||||
initial_processed_details = {
|
||||
normal_fr.id.hex: {'temp_processed_file': '/fake/temp_engine_dir/processed_normal.png', 'status': 'Processed', 'map_type': 'NORMAL'}
|
||||
}
|
||||
context = create_alpha_extraction_mock_context(
|
||||
initial_file_rules=[normal_fr],
|
||||
initial_processed_details=initial_processed_details
|
||||
)
|
||||
|
||||
updated_context = stage.execute(context)
|
||||
|
||||
mock_load_image.assert_not_called()
|
||||
mock_save_image.assert_not_called()
|
||||
assert len(updated_context.files_to_process) == 1
|
||||
assert normal_fr.id.hex in updated_context.processed_maps_details
|
||||
assert len(updated_context.processed_maps_details) == 1
|
||||
|
||||
@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.save_image')
|
||||
@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.load_image')
|
||||
@mock.patch('logging.warning') # Expect a warning log
|
||||
def test_load_image_fails(mock_log_warning, mock_load_image, mock_save_image):
|
||||
stage = AlphaExtractionToMaskStage()
|
||||
|
||||
albedo_rule_id = uuid.uuid4()
|
||||
albedo_fr = create_mock_file_rule_for_alpha_test(id_val=albedo_rule_id, map_type="ALBEDO")
|
||||
initial_processed_details = {
|
||||
albedo_fr.id.hex: {'temp_processed_file': '/fake/temp_engine_dir/processed_albedo_load_fail.png', 'status': 'Processed', 'map_type': 'ALBEDO', 'source_file_path': Path('/fake/source/albedo_load_fail.png')}
|
||||
}
|
||||
context = create_alpha_extraction_mock_context(
|
||||
initial_file_rules=[albedo_fr],
|
||||
initial_processed_details=initial_processed_details
|
||||
)
|
||||
|
||||
mock_load_image.return_value = None # Simulate load failure
|
||||
|
||||
updated_context = stage.execute(context)
|
||||
|
||||
mock_load_image.assert_called_once_with(Path('/fake/temp_engine_dir/processed_albedo_load_fail.png'))
|
||||
mock_save_image.assert_not_called()
|
||||
assert len(updated_context.files_to_process) == 1
|
||||
assert albedo_fr.id.hex in updated_context.processed_maps_details
|
||||
assert len(updated_context.processed_maps_details) == 1
|
||||
mock_log_warning.assert_called_once() # Check that a warning was logged
|
||||
|
||||
@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.save_image')
|
||||
@mock.patch('processing.pipeline.stages.alpha_extraction_to_mask.ipu.load_image')
|
||||
@mock.patch('logging.error') # Expect an error log
|
||||
def test_save_image_fails(mock_log_error, mock_load_image, mock_save_image):
|
||||
stage = AlphaExtractionToMaskStage()
|
||||
|
||||
albedo_rule_id = uuid.uuid4()
|
||||
albedo_fr = create_mock_file_rule_for_alpha_test(id_val=albedo_rule_id, map_type="ALBEDO")
|
||||
initial_processed_details = {
|
||||
albedo_fr.id.hex: {'temp_processed_file': '/fake/temp_engine_dir/processed_albedo_save_fail.png', 'status': 'Processed', 'map_type': 'ALBEDO', 'source_file_path': Path('/fake/source/albedo_save_fail.png')}
|
||||
}
|
||||
context = create_alpha_extraction_mock_context(
|
||||
initial_file_rules=[albedo_fr],
|
||||
initial_processed_details=initial_processed_details
|
||||
)
|
||||
|
||||
mock_rgba_data = np.zeros((10, 10, 4), dtype=np.uint8)
|
||||
mock_rgba_data[:, :, 3] = 128
|
||||
mock_load_image.side_effect = [mock_rgba_data, mock_rgba_data] # Load succeeds
|
||||
|
||||
mock_save_image.return_value = False # Simulate save failure
|
||||
|
||||
updated_context = stage.execute(context)
|
||||
|
||||
assert mock_load_image.call_count == 2
|
||||
mock_save_image.assert_called_once() # Save was attempted
|
||||
|
||||
assert len(updated_context.files_to_process) == 1 # No new MASK rule should be successfully added and detailed
|
||||
|
||||
# Check that no new MASK details were added, or if they were, they reflect failure.
|
||||
# The current stage logic returns context early, so no new rule or details should be present.
|
||||
mask_rule_found = any(fr.map_type == "MASK" for fr in updated_context.files_to_process)
|
||||
assert not mask_rule_found
|
||||
|
||||
mask_details_found = any(
|
||||
details['map_type'] == "MASK"
|
||||
for fr_id, details in updated_context.processed_maps_details.items()
|
||||
if fr_id != albedo_fr.id.hex # Exclude the original albedo
|
||||
)
|
||||
assert not mask_details_found
|
||||
mock_log_error.assert_called_once() # Check that an error was logged
|
||||
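Taken together, the assertions above pin down the stage's core transform: load the processed source map, check whether it carries an alpha channel, and write channel 3 out as a single-channel MASK. A minimal sketch of that transform (the function name and shapes are illustrative, not the stage's actual code):

```python
import numpy as np

def extract_alpha_as_mask(image: np.ndarray):
    """Return the alpha channel as a single-channel mask, or None if the image has no alpha."""
    # Only (H, W, 4) arrays carry an explicit alpha channel; RGB and grayscale do not qualify.
    if image.ndim != 3 or image.shape[2] != 4:
        return None
    return image[:, :, 3].copy()

rgba = np.zeros((10, 10, 4), dtype=np.uint8)
rgba[:, :, 3] = 128
mask = extract_alpha_as_mask(rgba)
assert mask is not None and np.array_equal(mask, rgba[:, :, 3])

rgb = np.zeros((10, 10, 3), dtype=np.uint8)
assert extract_alpha_as_mask(rgb) is None  # mirrors test_no_alpha_channel_in_source
```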
213
tests/processing/pipeline/stages/test_asset_skip_logic.py
Normal file
@@ -0,0 +1,213 @@
import pytest
from unittest import mock
from pathlib import Path
from typing import Dict, Optional, Any

from processing.pipeline.stages.asset_skip_logic import AssetSkipLogicStage
from processing.pipeline.asset_context import AssetProcessingContext
from rule_structure import AssetRule, SourceRule
from configuration import Configuration, GeneralSettings


# Helper function to create a mock AssetProcessingContext
def create_skip_logic_mock_context(
    effective_supplier: Optional[str] = "ValidSupplier",
    asset_process_status: str = "PENDING",
    overwrite_existing: bool = False,
    asset_name: str = "TestAssetSkip"
) -> AssetProcessingContext:
    mock_asset_rule = mock.MagicMock(spec=AssetRule)
    mock_asset_rule.name = asset_name
    mock_asset_rule.process_status = asset_process_status
    # Remaining AssetRule attributes set for completeness
    mock_asset_rule.source_path = "fake/source"
    mock_asset_rule.output_path = "fake/output"
    mock_asset_rule.maps = []
    mock_asset_rule.metadata = {}
    mock_asset_rule.material_name = None
    mock_asset_rule.notes = None
    mock_asset_rule.tags = []
    mock_asset_rule.enabled = True

    mock_source_rule = mock.MagicMock(spec=SourceRule)
    # SourceRule attributes set for completeness
    mock_source_rule.name = "TestSourceRule"
    mock_source_rule.path = "fake/source_rule_path"
    mock_source_rule.default_supplier = None
    mock_source_rule.assets = [mock_asset_rule]
    mock_source_rule.enabled = True

    mock_general_settings = mock.MagicMock(spec=GeneralSettings)
    mock_general_settings.overwrite_existing = overwrite_existing

    mock_config = mock.MagicMock(spec=Configuration)
    mock_config.general_settings = mock_general_settings
    mock_config.suppliers = {"ValidSupplier": mock.MagicMock()}

    context = AssetProcessingContext(
        source_rule=mock_source_rule,
        asset_rule=mock_asset_rule,
        workspace_path=Path("/fake/workspace"),
        engine_temp_dir=Path("/fake/temp"),
        output_base_path=Path("/fake/output"),
        effective_supplier=effective_supplier,
        asset_metadata={},
        processed_maps_details={},
        merged_maps_details={},
        files_to_process=[],
        loaded_data_cache={},
        config_obj=mock_config,
        status_flags={},
        incrementing_value=None,
        sha5_value=None  # TODO: confirm field name (possibly sha256_value) on AssetProcessingContext
    )
    return context


@mock.patch('logging.info')
def test_skip_due_to_missing_supplier(mock_log_info):
    """
    Test that the asset is skipped if effective_supplier is None.
    """
    stage = AssetSkipLogicStage()
    context = create_skip_logic_mock_context(effective_supplier=None, asset_name="MissingSupplierAsset")

    updated_context = stage.execute(context)

    assert updated_context.status_flags.get('skip_asset') is True
    assert updated_context.status_flags.get('skip_reason') == "Invalid or missing supplier"
    mock_log_info.assert_any_call("Asset 'MissingSupplierAsset': Skipping due to missing or invalid supplier.")


@mock.patch('logging.info')
def test_skip_due_to_process_status_skip(mock_log_info):
    """
    Test that the asset is skipped if asset_rule.process_status is "SKIP".
    """
    stage = AssetSkipLogicStage()
    context = create_skip_logic_mock_context(asset_process_status="SKIP", asset_name="SkipStatusAsset")

    updated_context = stage.execute(context)

    assert updated_context.status_flags.get('skip_asset') is True
    assert updated_context.status_flags.get('skip_reason') == "Process status set to SKIP"
    mock_log_info.assert_any_call("Asset 'SkipStatusAsset': Skipping because process_status is 'SKIP'.")


@mock.patch('logging.info')
def test_skip_due_to_processed_and_overwrite_disabled(mock_log_info):
    """
    Test that the asset is skipped if asset_rule.process_status is "PROCESSED"
    and overwrite_existing is False.
    """
    stage = AssetSkipLogicStage()
    context = create_skip_logic_mock_context(
        asset_process_status="PROCESSED",
        overwrite_existing=False,
        asset_name="ProcessedNoOverwriteAsset"
    )

    updated_context = stage.execute(context)

    assert updated_context.status_flags.get('skip_asset') is True
    assert updated_context.status_flags.get('skip_reason') == "Already processed, overwrite disabled"
    mock_log_info.assert_any_call("Asset 'ProcessedNoOverwriteAsset': Skipping because already processed and overwrite is disabled.")


@mock.patch('logging.info')
def test_no_skip_when_processed_and_overwrite_enabled(mock_log_info):
    """
    Test that the asset is NOT skipped if asset_rule.process_status is "PROCESSED"
    but overwrite_existing is True.
    """
    stage = AssetSkipLogicStage()
    context = create_skip_logic_mock_context(
        asset_process_status="PROCESSED",
        overwrite_existing=True,
        effective_supplier="ValidSupplier",  # Ensure supplier is valid
        asset_name="ProcessedOverwriteAsset"
    )

    updated_context = stage.execute(context)

    assert updated_context.status_flags.get('skip_asset', False) is False  # Default to False if key not present
    # No specific skip_reason to check if not skipped.
    # Check that no skip log message was emitted for any skip reason.
    for call_args in mock_log_info.call_args_list:
        assert "Skipping because already processed and overwrite is disabled" not in call_args[0][0]
        assert "Skipping due to missing or invalid supplier" not in call_args[0][0]
        assert "Skipping because process_status is 'SKIP'" not in call_args[0][0]


@mock.patch('logging.info')
def test_no_skip_when_process_status_pending(mock_log_info):
    """
    Test that the asset is NOT skipped if asset_rule.process_status is "PENDING".
    """
    stage = AssetSkipLogicStage()
    context = create_skip_logic_mock_context(
        asset_process_status="PENDING",
        effective_supplier="ValidSupplier",  # Ensure supplier is valid
        asset_name="PendingAsset"
    )

    updated_context = stage.execute(context)

    assert updated_context.status_flags.get('skip_asset', False) is False
    # Check that no skip log message was called
    for call_args in mock_log_info.call_args_list:
        assert "Skipping" not in call_args[0][0]


@mock.patch('logging.info')
def test_no_skip_when_process_status_failed_previously(mock_log_info):
    """
    Test that the asset is NOT skipped if asset_rule.process_status is "FAILED_PREVIOUSLY".
    """
    stage = AssetSkipLogicStage()
    context = create_skip_logic_mock_context(
        asset_process_status="FAILED_PREVIOUSLY",
        effective_supplier="ValidSupplier",  # Ensure supplier is valid
        asset_name="FailedPreviouslyAsset"
    )

    updated_context = stage.execute(context)

    assert updated_context.status_flags.get('skip_asset', False) is False
    # Check that no skip log message was called
    for call_args in mock_log_info.call_args_list:
        assert "Skipping" not in call_args[0][0]


@mock.patch('logging.info')
def test_no_skip_when_process_status_other_valid_status(mock_log_info):
    """
    Test that the asset is NOT skipped for other valid, non-skip process statuses.
    """
    stage = AssetSkipLogicStage()
    context = create_skip_logic_mock_context(
        asset_process_status="READY_FOR_PROCESSING",  # Example of another non-skip status
        effective_supplier="ValidSupplier",
        asset_name="ReadyAsset"
    )
    updated_context = stage.execute(context)
    assert updated_context.status_flags.get('skip_asset', False) is False
    for call_args in mock_log_info.call_args_list:
        assert "Skipping" not in call_args[0][0]


@mock.patch('logging.info')
def test_skip_asset_flag_initialized_if_not_present(mock_log_info):
    """
    Test that 'skip_asset' is initialized to False in status_flags if not skipped and not present.
    """
    stage = AssetSkipLogicStage()
    context = create_skip_logic_mock_context(
        asset_process_status="PENDING",
        effective_supplier="ValidSupplier",
        asset_name="InitFlagAsset"
    )
    # Ensure status_flags is empty before execute
    context.status_flags = {}

    updated_context = stage.execute(context)

    # If not skipped, 'skip_asset' should be explicitly False.
    assert updated_context.status_flags.get('skip_asset') is False
    # No skip reason should be set
    assert 'skip_reason' not in updated_context.status_flags
    for call_args in mock_log_info.call_args_list:
        assert "Skipping" not in call_args[0][0]
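The skip-logic tests above jointly specify a small decision table. Condensed as a pure function for reference (the real stage writes its result into `context.status_flags`; the function and parameter names here are inferred from the assertions, not taken from the stage source):

```python
from typing import Optional, Tuple

def decide_skip(effective_supplier: Optional[str],
                process_status: str,
                overwrite_existing: bool) -> Tuple[bool, Optional[str]]:
    """Return (skip_asset, skip_reason) matching the expectations asserted above."""
    if not effective_supplier:
        return True, "Invalid or missing supplier"
    if process_status == "SKIP":
        return True, "Process status set to SKIP"
    if process_status == "PROCESSED" and not overwrite_existing:
        return True, "Already processed, overwrite disabled"
    # PENDING, FAILED_PREVIOUSLY, and other statuses fall through to processing.
    return False, None

assert decide_skip(None, "PENDING", False) == (True, "Invalid or missing supplier")
assert decide_skip("ValidSupplier", "PROCESSED", True) == (False, None)
```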
330
tests/processing/pipeline/stages/test_file_rule_filter.py
Normal file
@@ -0,0 +1,330 @@
import pytest
|
||||
from unittest import mock
|
||||
from pathlib import Path
|
||||
import uuid
|
||||
from typing import Optional # Added Optional for type hinting
|
||||
|
||||
from processing.pipeline.stages.file_rule_filter import FileRuleFilterStage
|
||||
from processing.pipeline.asset_context import AssetProcessingContext
|
||||
from rule_structure import AssetRule, SourceRule, FileRule # FileRule is key here
|
||||
from configuration import Configuration # Minimal config needed
|
||||
|
||||
def create_mock_file_rule(
|
||||
id_val: Optional[uuid.UUID] = None,
|
||||
map_type: str = "Diffuse",
|
||||
filename_pattern: str = "*.tif",
|
||||
item_type: str = "MAP_COL", # e.g., MAP_COL, FILE_IGNORE
|
||||
active: bool = True
|
||||
) -> mock.MagicMock: # Return MagicMock to easily set other attributes if needed
|
||||
mock_fr = mock.MagicMock(spec=FileRule)
|
||||
mock_fr.id = id_val if id_val else uuid.uuid4()
|
||||
mock_fr.map_type = map_type
|
||||
mock_fr.filename_pattern = filename_pattern
|
||||
mock_fr.item_type = item_type
|
||||
mock_fr.active = active
|
||||
return mock_fr
|
||||
|
||||
def create_file_filter_mock_context(
|
||||
file_rules_list: Optional[list] = None, # List of mock FileRule objects
|
||||
skip_asset_flag: bool = False,
|
||||
asset_name: str = "FileFilterAsset"
|
||||
) -> AssetProcessingContext:
|
||||
mock_asset_rule = mock.MagicMock(spec=AssetRule)
|
||||
mock_asset_rule.name = asset_name
|
||||
mock_asset_rule.file_rules = file_rules_list if file_rules_list is not None else []
|
||||
|
||||
mock_source_rule = mock.MagicMock(spec=SourceRule)
|
||||
mock_config = mock.MagicMock(spec=Configuration)
|
||||
|
||||
context = AssetProcessingContext(
|
||||
source_rule=mock_source_rule,
|
||||
asset_rule=mock_asset_rule,
|
||||
workspace_path=Path("/fake/workspace"),
|
||||
engine_temp_dir=Path("/fake/temp"),
|
||||
output_base_path=Path("/fake/output"),
|
||||
effective_supplier="ValidSupplier", # Assume valid for this stage
|
||||
asset_metadata={'asset_name': asset_name}, # Assume metadata init happened
|
||||
processed_maps_details={},
|
||||
merged_maps_details={},
|
||||
files_to_process=[], # Stage will populate this
|
||||
loaded_data_cache={},
|
||||
config_obj=mock_config,
|
||||
status_flags={'skip_asset': skip_asset_flag},
|
||||
incrementing_value=None,
|
||||
sha5_value=None # Corrected from sha5_value to sha256_value based on AssetProcessingContext
|
||||
)
|
||||
return context
|
||||
# Test Cases for FileRuleFilterStage.execute()
|
||||
|
||||
@mock.patch('logging.info')
|
||||
@mock.patch('logging.debug')
|
||||
def test_file_rule_filter_asset_skipped(mock_log_debug, mock_log_info):
|
||||
"""
|
||||
Test case: Asset Skipped - status_flags['skip_asset'] is True.
|
||||
Assert context.files_to_process remains empty.
|
||||
"""
|
||||
stage = FileRuleFilterStage()
|
||||
context = create_file_filter_mock_context(skip_asset_flag=True)
|
||||
|
||||
updated_context = stage.execute(context)
|
||||
|
||||
assert len(updated_context.files_to_process) == 0
|
||||
mock_log_info.assert_any_call(f"Asset '{context.asset_rule.name}': Skipping file rule filtering as 'skip_asset' is True.")
|
||||
|
||||
@mock.patch('logging.info')
|
||||
@mock.patch('logging.debug')
|
||||
def test_file_rule_filter_no_file_rules(mock_log_debug, mock_log_info):
|
||||
"""
|
||||
Test case: No File Rules - asset_rule.file_rules is empty.
|
||||
Assert context.files_to_process is empty.
|
||||
"""
|
||||
stage = FileRuleFilterStage()
|
||||
context = create_file_filter_mock_context(file_rules_list=[])
|
||||
|
||||
updated_context = stage.execute(context)
|
||||
|
||||
assert len(updated_context.files_to_process) == 0
|
||||
mock_log_info.assert_any_call(f"Asset '{context.asset_rule.name}': No file rules defined. Skipping file rule filtering.")
|
||||
|
||||
@mock.patch('logging.info')
|
||||
@mock.patch('logging.debug')
|
||||
def test_file_rule_filter_only_active_processable_rules(mock_log_debug, mock_log_info):
|
||||
"""
|
||||
Test case: Only Active, Processable Rules - All FileRules are active=True and item_type="MAP_COL".
|
||||
Assert all are added to context.files_to_process.
|
||||
"""
|
||||
stage = FileRuleFilterStage()
|
||||
fr1 = create_mock_file_rule(filename_pattern="diffuse.png", item_type="MAP_COL", active=True)
|
||||
fr2 = create_mock_file_rule(filename_pattern="normal.png", item_type="MAP_COL", active=True)
|
||||
context = create_file_filter_mock_context(file_rules_list=[fr1, fr2])
|
||||
|
||||
updated_context = stage.execute(context)
|
||||
|
||||
assert len(updated_context.files_to_process) == 2
|
||||
assert fr1 in updated_context.files_to_process
|
||||
assert fr2 in updated_context.files_to_process
|
||||
mock_log_info.assert_any_call(f"Asset '{context.asset_rule.name}': 2 file rules queued for processing after filtering.")
|
||||
|
||||
@mock.patch('logging.info')
|
||||
@mock.patch('logging.debug')
|
||||
def test_file_rule_filter_inactive_rules(mock_log_debug, mock_log_info):
|
||||
"""
|
||||
Test case: Inactive Rules - Some FileRules have active=False.
|
||||
Assert only active rules are added.
|
||||
"""
|
||||
stage = FileRuleFilterStage()
|
||||
fr_active = create_mock_file_rule(filename_pattern="active.png", item_type="MAP_COL", active=True)
|
||||
fr_inactive = create_mock_file_rule(filename_pattern="inactive.png", item_type="MAP_COL", active=False)
|
||||
fr_another_active = create_mock_file_rule(filename_pattern="another_active.jpg", item_type="MAP_COL", active=True)
|
||||
context = create_file_filter_mock_context(file_rules_list=[fr_active, fr_inactive, fr_another_active])
|
||||
|
||||
updated_context = stage.execute(context)
|
||||
|
||||
assert len(updated_context.files_to_process) == 2
|
||||
assert fr_active in updated_context.files_to_process
|
||||
assert fr_another_active in updated_context.files_to_process
|
||||
assert fr_inactive not in updated_context.files_to_process
|
||||
mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Skipping inactive file rule: '{fr_inactive.filename_pattern}'")
|
||||
mock_log_info.assert_any_call(f"Asset '{context.asset_rule.name}': 2 file rules queued for processing after filtering.")
|
||||
|
||||
@mock.patch('logging.info')
|
||||
@mock.patch('logging.debug')
|
||||
def test_file_rule_filter_file_ignore_simple_match(mock_log_debug, mock_log_info):
|
||||
"""
|
||||
Test case: FILE_IGNORE Rule (Simple Match).
|
||||
One FILE_IGNORE rule with filename_pattern="*_ignore.png".
|
||||
One MAP_COL rule with filename_pattern="diffuse_ignore.png".
|
||||
One MAP_COL rule with filename_pattern="normal_process.png".
|
||||
Assert only "normal_process.png" rule is added.
|
||||
"""
|
||||
stage = FileRuleFilterStage()
|
||||
fr_ignore = create_mock_file_rule(filename_pattern="*_ignore.png", item_type="FILE_IGNORE", active=True)
|
||||
fr_ignored_map = create_mock_file_rule(filename_pattern="diffuse_ignore.png", item_type="MAP_COL", active=True)
|
||||
fr_process_map = create_mock_file_rule(filename_pattern="normal_process.png", item_type="MAP_COL", active=True)
|
||||
context = create_file_filter_mock_context(file_rules_list=[fr_ignore, fr_ignored_map, fr_process_map])
|
||||
|
||||
updated_context = stage.execute(context)
|
||||
|
||||
assert len(updated_context.files_to_process) == 1
|
||||
assert fr_process_map in updated_context.files_to_process
|
||||
assert fr_ignored_map not in updated_context.files_to_process
|
||||
mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Registering ignore pattern: '{fr_ignore.filename_pattern}'")
|
||||
mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Skipping file rule '{fr_ignored_map.filename_pattern}' due to matching ignore pattern.")
|
||||
mock_log_info.assert_any_call(f"Asset '{context.asset_rule.name}': 1 file rules queued for processing after filtering.")
|
||||
|
||||
@mock.patch('logging.info')
|
||||
@mock.patch('logging.debug')
|
||||
def test_file_rule_filter_file_ignore_glob_pattern(mock_log_debug, mock_log_info):
|
||||
"""
|
||||
Test case: FILE_IGNORE Rule (Glob Pattern).
|
||||
One FILE_IGNORE rule with filename_pattern="*_ignore.*".
|
||||
MAP_COL rules: "tex_ignore.tif", "tex_process.png".
|
||||
Assert only "tex_process.png" rule is added.
|
||||
"""
|
||||
stage = FileRuleFilterStage()
|
||||
fr_ignore_glob = create_mock_file_rule(filename_pattern="*_ignore.*", item_type="FILE_IGNORE", active=True)
|
||||
fr_ignored_tif = create_mock_file_rule(filename_pattern="tex_ignore.tif", item_type="MAP_COL", active=True)
|
||||
fr_process_png = create_mock_file_rule(filename_pattern="tex_process.png", item_type="MAP_COL", active=True)
|
||||
context = create_file_filter_mock_context(file_rules_list=[fr_ignore_glob, fr_ignored_tif, fr_process_png])
|
||||
|
||||
updated_context = stage.execute(context)
|
||||
|
||||
assert len(updated_context.files_to_process) == 1
|
||||
assert fr_process_png in updated_context.files_to_process
|
||||
assert fr_ignored_tif not in updated_context.files_to_process
|
||||
mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Registering ignore pattern: '{fr_ignore_glob.filename_pattern}'")
|
||||
mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Skipping file rule '{fr_ignored_tif.filename_pattern}' due to matching ignore pattern.")
|
||||
mock_log_info.assert_any_call(f"Asset '{context.asset_rule.name}': 1 file rules queued for processing after filtering.")
|
||||
|
||||
@mock.patch('logging.info')
|
||||
@mock.patch('logging.debug')
|
||||
def test_file_rule_filter_multiple_file_ignore_rules(mock_log_debug, mock_log_info):
|
||||
"""
|
||||
Test case: Multiple FILE_IGNORE Rules.
|
||||
Test with several ignore patterns and ensure they are all respected.
|
||||
"""
|
||||
stage = FileRuleFilterStage()
|
||||
fr_ignore1 = create_mock_file_rule(filename_pattern="*.tmp", item_type="FILE_IGNORE", active=True)
|
||||
fr_ignore2 = create_mock_file_rule(filename_pattern="backup_*", item_type="FILE_IGNORE", active=True)
|
||||
fr_ignore3 = create_mock_file_rule(filename_pattern="*_old.png", item_type="FILE_IGNORE", active=True)
|
||||
|
||||
fr_map_ignored1 = create_mock_file_rule(filename_pattern="data.tmp", item_type="MAP_COL", active=True)
|
||||
fr_map_ignored2 = create_mock_file_rule(filename_pattern="backup_diffuse.jpg", item_type="MAP_COL", active=True)
|
||||
fr_map_ignored3 = create_mock_file_rule(filename_pattern="normal_old.png", item_type="MAP_COL", active=True)
|
||||
fr_map_process = create_mock_file_rule(filename_pattern="final_texture.tif", item_type="MAP_COL", active=True)
|
||||
|
||||
context = create_file_filter_mock_context(file_rules_list=[
|
||||
fr_ignore1, fr_ignore2, fr_ignore3,
|
||||
fr_map_ignored1, fr_map_ignored2, fr_map_ignored3, fr_map_process
|
||||
])
|
||||
|
||||
updated_context = stage.execute(context)
|
||||
|
||||
assert len(updated_context.files_to_process) == 1
|
||||
assert fr_map_process in updated_context.files_to_process
|
||||
assert fr_map_ignored1 not in updated_context.files_to_process
|
||||
assert fr_map_ignored2 not in updated_context.files_to_process
|
||||
assert fr_map_ignored3 not in updated_context.files_to_process
|
||||
mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Registering ignore pattern: '{fr_ignore1.filename_pattern}'")
|
||||
mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Registering ignore pattern: '{fr_ignore2.filename_pattern}'")
|
||||
mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Registering ignore pattern: '{fr_ignore3.filename_pattern}'")
|
||||
mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Skipping file rule '{fr_map_ignored1.filename_pattern}' due to matching ignore pattern.")
|
||||
mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Skipping file rule '{fr_map_ignored2.filename_pattern}' due to matching ignore pattern.")
|
||||
mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Skipping file rule '{fr_map_ignored3.filename_pattern}' due to matching ignore pattern.")
|
||||
mock_log_info.assert_any_call(f"Asset '{context.asset_rule.name}': 1 file rules queued for processing after filtering.")
|
||||
|
||||
@mock.patch('logging.info')
|
||||
@mock.patch('logging.debug')
|
||||
def test_file_rule_filter_file_ignore_rule_is_inactive(mock_log_debug, mock_log_info):
|
||||
"""
|
||||
Test case: FILE_IGNORE Rule is Inactive.
|
||||
An ignore rule itself is active=False. Assert its pattern is NOT used for filtering.
|
||||
"""
|
||||
stage = FileRuleFilterStage()
|
||||
fr_inactive_ignore = create_mock_file_rule(filename_pattern="*_ignore.tif", item_type="FILE_IGNORE", active=False)
|
||||
fr_should_process1 = create_mock_file_rule(filename_pattern="diffuse_ignore.tif", item_type="MAP_COL", active=True) # Should be processed
|
||||
fr_should_process2 = create_mock_file_rule(filename_pattern="normal_ok.png", item_type="MAP_COL", active=True)
|
||||
context = create_file_filter_mock_context(file_rules_list=[fr_inactive_ignore, fr_should_process1, fr_should_process2])
|
||||
|
||||
updated_context = stage.execute(context)
|
||||
|
||||
assert len(updated_context.files_to_process) == 2
|
||||
    assert fr_should_process1 in updated_context.files_to_process
    assert fr_should_process2 in updated_context.files_to_process
    # Ensure the inactive ignore rule's pattern was not registered.
    # We check this by scanning mock_log_debug.call_args_list for a debug log
    # registering *that specific* pattern.
    for call in mock_log_debug.call_args_list:
        args, kwargs = call
        if "Registering ignore pattern" in args[0] and fr_inactive_ignore.filename_pattern in args[0]:
            pytest.fail(f"Inactive ignore pattern '{fr_inactive_ignore.filename_pattern}' was incorrectly registered.")
    mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Skipping inactive file rule: '{fr_inactive_ignore.filename_pattern}' (type: FILE_IGNORE)")
    mock_log_info.assert_any_call(f"Asset '{context.asset_rule.name}': 2 file rules queued for processing after filtering.")


@mock.patch('logging.info')
@mock.patch('logging.debug')
def test_file_rule_filter_no_file_ignore_rules(mock_log_debug, mock_log_info):
    """
    Test case: no FILE_IGNORE rules.
    All rules are MAP_COL or other processable types.
    Assert that all active, processable rules are included.
    """
    stage = FileRuleFilterStage()
    fr1 = create_mock_file_rule(filename_pattern="diffuse.png", item_type="MAP_COL", active=True)
    fr2 = create_mock_file_rule(filename_pattern="normal.png", item_type="MAP_COL", active=True)
    fr_other_type = create_mock_file_rule(filename_pattern="spec.tif", item_type="MAP_SPEC", active=True)  # Assuming MAP_SPEC is processable
    fr_inactive = create_mock_file_rule(filename_pattern="ao.jpg", item_type="MAP_AO", active=False)

    context = create_file_filter_mock_context(file_rules_list=[fr1, fr2, fr_other_type, fr_inactive])

    updated_context = stage.execute(context)

    assert len(updated_context.files_to_process) == 3
    assert fr1 in updated_context.files_to_process
    assert fr2 in updated_context.files_to_process
    assert fr_other_type in updated_context.files_to_process
    assert fr_inactive not in updated_context.files_to_process
    mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Skipping inactive file rule: '{fr_inactive.filename_pattern}'")
    mock_log_info.assert_any_call(f"Asset '{context.asset_rule.name}': 3 file rules queued for processing after filtering.")

@mock.patch('logging.info')
@mock.patch('logging.debug')
def test_file_rule_filter_item_type_not_processable(mock_log_debug, mock_log_info):
    """
    Test case: item type is not processable (e.g., not MAP_COL, MAP_AO, etc., but something else like 'METADATA_ONLY').
    Assert such rules are not added to files_to_process, unless they are FILE_IGNORE.
    """
    stage = FileRuleFilterStage()
    fr_processable = create_mock_file_rule(filename_pattern="diffuse.png", item_type="MAP_COL", active=True)
    fr_not_processable = create_mock_file_rule(filename_pattern="info.txt", item_type="METADATA_ONLY", active=True)
    fr_ignore = create_mock_file_rule(filename_pattern="*.bak", item_type="FILE_IGNORE", active=True)
    fr_ignored_by_bak = create_mock_file_rule(filename_pattern="diffuse.bak", item_type="MAP_COL", active=True)

    context = create_file_filter_mock_context(file_rules_list=[fr_processable, fr_not_processable, fr_ignore, fr_ignored_by_bak])

    updated_context = stage.execute(context)

    assert len(updated_context.files_to_process) == 1
    assert fr_processable in updated_context.files_to_process
    assert fr_not_processable not in updated_context.files_to_process
    assert fr_ignored_by_bak not in updated_context.files_to_process

    mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Registering ignore pattern: '{fr_ignore.filename_pattern}'")
    mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Skipping file rule '{fr_not_processable.filename_pattern}' as its item_type '{fr_not_processable.item_type}' is not processable.")
    mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Skipping file rule '{fr_ignored_by_bak.filename_pattern}' due to matching ignore pattern.")
    mock_log_info.assert_any_call(f"Asset '{context.asset_rule.name}': 1 file rules queued for processing after filtering.")

# Example tests from instructions (can be adapted or used as a base)
@mock.patch('logging.info')
@mock.patch('logging.debug')
def test_file_rule_filter_basic_active_example(mock_log_debug, mock_log_info):  # Renamed to avoid conflict
    stage = FileRuleFilterStage()
    fr1 = create_mock_file_rule(filename_pattern="diffuse.png", item_type="MAP_COL", active=True)
    fr2 = create_mock_file_rule(filename_pattern="normal.png", item_type="MAP_COL", active=True)
    context = create_file_filter_mock_context(file_rules_list=[fr1, fr2])

    updated_context = stage.execute(context)

    assert len(updated_context.files_to_process) == 2
    assert fr1 in updated_context.files_to_process
    assert fr2 in updated_context.files_to_process
    mock_log_info.assert_any_call(f"Asset '{context.asset_rule.name}': 2 file rules queued for processing after filtering.")


@mock.patch('logging.info')
@mock.patch('logging.debug')
def test_file_rule_filter_with_file_ignore_example(mock_log_debug, mock_log_info):  # Renamed to avoid conflict
    stage = FileRuleFilterStage()
    fr_ignore = create_mock_file_rule(filename_pattern="*_ignore.tif", item_type="FILE_IGNORE", active=True)
    fr_process = create_mock_file_rule(filename_pattern="diffuse_ok.tif", item_type="MAP_COL", active=True)
    fr_skip = create_mock_file_rule(filename_pattern="normal_ignore.tif", item_type="MAP_COL", active=True)
    context = create_file_filter_mock_context(file_rules_list=[fr_ignore, fr_process, fr_skip])

    updated_context = stage.execute(context)

    assert len(updated_context.files_to_process) == 1
    assert fr_process in updated_context.files_to_process
    assert fr_skip not in updated_context.files_to_process
    mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Registering ignore pattern: '{fr_ignore.filename_pattern}'")
    mock_log_debug.assert_any_call(f"Asset '{context.asset_rule.name}': Skipping file rule '{fr_skip.filename_pattern}' due to matching ignore pattern.")
    mock_log_info.assert_any_call(f"Asset '{context.asset_rule.name}': 1 file rules queued for processing after filtering.")
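The ignore-pattern matches asserted in these tests (`*.bak` catching `diffuse.bak`, `*_ignore.tif` catching `normal_ignore.tif`) assume shell-style glob semantics; whether the stage uses `fnmatch` internally is an assumption, but the expected match behavior can be sanity-checked with it:

```python
from fnmatch import fnmatch

# Shell-style glob matching, as the test expectations imply.
assert fnmatch("diffuse.bak", "*.bak")
assert fnmatch("normal_ignore.tif", "*_ignore.tif")
assert not fnmatch("diffuse_ok.tif", "*_ignore.tif")
```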
@@ -0,0 +1,486 @@
import pytest
from unittest import mock
from pathlib import Path
import uuid
import numpy as np
from typing import Optional, List, Dict

from processing.pipeline.stages.gloss_to_rough_conversion import GlossToRoughConversionStage
from processing.pipeline.asset_context import AssetProcessingContext
from rule_structure import AssetRule, SourceRule, FileRule
from configuration import Configuration, GeneralSettings
# No direct ipu import is needed in the tests, since we mock its usage by the stage.


def create_mock_file_rule_for_gloss_test(
    id_val: Optional[uuid.UUID] = None,
    map_type: str = "GLOSS",  # Test with GLOSS and other types
    filename_pattern: str = "gloss.png"
) -> mock.MagicMock:
    mock_fr = mock.MagicMock(spec=FileRule)
    mock_fr.id = id_val if id_val else uuid.uuid4()
    mock_fr.map_type = map_type
    mock_fr.filename_pattern = filename_pattern
    mock_fr.item_type = "MAP_COL"
    mock_fr.active = True
    return mock_fr


def create_gloss_conversion_mock_context(
    initial_file_rules: Optional[List[FileRule]] = None,
    initial_processed_details: Optional[Dict] = None,
    skip_asset_flag: bool = False,
    asset_name: str = "GlossAsset",
    # Add a mock for general_settings if the stage checks a global flag:
    # convert_gloss_globally: bool = True
) -> AssetProcessingContext:
    mock_asset_rule = mock.MagicMock(spec=AssetRule)
    mock_asset_rule.name = asset_name
    mock_asset_rule.file_rules = initial_file_rules if initial_file_rules is not None else []

    mock_source_rule = mock.MagicMock(spec=SourceRule)

    mock_gs = mock.MagicMock(spec=GeneralSettings)
    # If the stage uses a global flag:
    # mock_gs.convert_gloss_to_rough_globally = convert_gloss_globally

    mock_config = mock.MagicMock(spec=Configuration)
    mock_config.general_settings = mock_gs

    context = AssetProcessingContext(
        source_rule=mock_source_rule,
        asset_rule=mock_asset_rule,
        workspace_path=Path("/fake/workspace"),
        engine_temp_dir=Path("/fake/temp_engine_dir"),  # Important for new temp file paths
        output_base_path=Path("/fake/output"),
        effective_supplier="ValidSupplier",
        asset_metadata={'asset_name': asset_name},
        processed_maps_details=initial_processed_details if initial_processed_details is not None else {},
        merged_maps_details={},
        files_to_process=list(initial_file_rules) if initial_file_rules else [],  # The stage modifies this list
        loaded_data_cache={},
        config_obj=mock_config,
        status_flags={'skip_asset': skip_asset_flag},
        incrementing_value=None,  # Added as per the AssetProcessingContext definition
        sha5_value=None  # Added as per the AssetProcessingContext definition
    )
    return context


# Unit tests follow below.
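The stage itself is exercised below purely through mocks; the assertions expect uint8 gloss maps to invert as `255 - data` and float maps as `1.0 - data`. As a minimal sketch of that expectation (`invert_gloss` is a hypothetical helper for illustration, not the stage's actual API):

```python
import numpy as np

def invert_gloss(data: np.ndarray) -> np.ndarray:
    # Hypothetical sketch: integer maps invert around the dtype maximum
    # (255 for uint8); float maps invert around 1.0, matching the test
    # expectations in this file.
    if np.issubdtype(data.dtype, np.integer):
        return np.iinfo(data.dtype).max - data
    return 1.0 - data
```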
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.save_image')
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.load_image')
def test_asset_skipped(mock_load_image, mock_save_image):
    """
    Test that if 'skip_asset' is True, no processing occurs.
    """
    stage = GlossToRoughConversionStage()

    gloss_rule_id = uuid.uuid4()
    gloss_fr = create_mock_file_rule_for_gloss_test(id_val=gloss_rule_id, map_type="GLOSS")

    initial_details = {
        gloss_fr.id.hex: {'temp_processed_file': '/fake/temp_engine_dir/processed_gloss_map.png', 'status': 'Processed', 'map_type': 'GLOSS'}
    }
    context = create_gloss_conversion_mock_context(
        initial_file_rules=[gloss_fr],
        initial_processed_details=initial_details,
        skip_asset_flag=True  # Asset is skipped
    )

    # Keep copies of files_to_process and processed_maps_details to compare against.
    original_files_to_process = list(context.files_to_process)
    original_processed_maps_details = context.processed_maps_details.copy()

    updated_context = stage.execute(context)

    mock_load_image.assert_not_called()
    mock_save_image.assert_not_called()

    assert updated_context.files_to_process == original_files_to_process, "files_to_process should not change if asset is skipped"
    assert updated_context.processed_maps_details == original_processed_maps_details, "processed_maps_details should not change if asset is skipped"
    assert updated_context.status_flags['skip_asset'] is True

@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.save_image')
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.load_image')
def test_no_gloss_map_present(mock_load_image, mock_save_image):
    """
    Test that if no GLOSS maps are in files_to_process, no conversion occurs.
    """
    stage = GlossToRoughConversionStage()

    normal_rule_id = uuid.uuid4()
    normal_fr = create_mock_file_rule_for_gloss_test(id_val=normal_rule_id, map_type="NORMAL", filename_pattern="normal.png")
    albedo_fr = create_mock_file_rule_for_gloss_test(map_type="ALBEDO", filename_pattern="albedo.jpg")

    initial_details = {
        normal_fr.id.hex: {'temp_processed_file': '/fake/temp_engine_dir/processed_normal_map.png', 'status': 'Processed', 'map_type': 'NORMAL'}
    }
    context = create_gloss_conversion_mock_context(
        initial_file_rules=[normal_fr, albedo_fr],
        initial_processed_details=initial_details
    )

    original_files_to_process = list(context.files_to_process)
    original_processed_maps_details = context.processed_maps_details.copy()

    updated_context = stage.execute(context)

    mock_load_image.assert_not_called()
    mock_save_image.assert_not_called()

    assert updated_context.files_to_process == original_files_to_process, "files_to_process should not change if no GLOSS maps are present"
    assert updated_context.processed_maps_details == original_processed_maps_details, "processed_maps_details should not change if no GLOSS maps are present"

    # Ensure map types of existing rules are unchanged.
    for fr_in_list in updated_context.files_to_process:
        if fr_in_list.id == normal_fr.id:
            assert fr_in_list.map_type == "NORMAL"
        elif fr_in_list.id == albedo_fr.id:
            assert fr_in_list.map_type == "ALBEDO"

@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.logging')  # Mock logging
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.save_image')
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.load_image')
def test_gloss_conversion_uint8_success(mock_load_image, mock_save_image, mock_logging):
    """
    Test successful conversion of a GLOSS map (uint8 data) to ROUGHNESS.
    """
    stage = GlossToRoughConversionStage()

    gloss_rule_id = uuid.uuid4()
    # Use a distinct filename for the gloss map to ensure correct path construction.
    gloss_fr = create_mock_file_rule_for_gloss_test(id_val=gloss_rule_id, map_type="GLOSS", filename_pattern="my_gloss_map.png")
    other_fr_id = uuid.uuid4()
    other_fr = create_mock_file_rule_for_gloss_test(id_val=other_fr_id, map_type="NORMAL", filename_pattern="normal_map.png")

    initial_gloss_temp_path = Path("/fake/temp_engine_dir/processed_gloss_map.png")
    initial_other_temp_path = Path("/fake/temp_engine_dir/processed_normal_map.png")

    initial_details = {
        gloss_fr.id.hex: {'temp_processed_file': str(initial_gloss_temp_path), 'status': 'Processed', 'map_type': 'GLOSS'},
        other_fr.id.hex: {'temp_processed_file': str(initial_other_temp_path), 'status': 'Processed', 'map_type': 'NORMAL'}
    }
    context = create_gloss_conversion_mock_context(
        initial_file_rules=[gloss_fr, other_fr],
        initial_processed_details=initial_details
    )

    mock_loaded_gloss_data = np.array([10, 50, 250], dtype=np.uint8)
    mock_load_image.return_value = mock_loaded_gloss_data
    mock_save_image.return_value = True  # Simulate a successful save

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(initial_gloss_temp_path)

    # Check that save_image was called with inverted data and the correct path.
    expected_inverted_data = 255 - mock_loaded_gloss_data

    # call_args[0] is a tuple of positional args, call_args[1] is a dict of kwargs.
    saved_path_arg = mock_save_image.call_args[0][0]
    saved_data_arg = mock_save_image.call_args[0][1]

    assert np.array_equal(saved_data_arg, expected_inverted_data), "Image data passed to save_image is not correctly inverted."
    assert "rough_from_gloss_" in saved_path_arg.name, "Saved file name should indicate conversion from gloss."
    assert saved_path_arg.parent == Path("/fake/temp_engine_dir"), "Saved file should be in the engine temp directory."
    # Ensure the new filename is based on the original gloss map's ID for uniqueness.
    assert gloss_fr.id.hex in saved_path_arg.name

    # Check context.files_to_process.
    assert len(updated_context.files_to_process) == 2, "Number of file rules in context should remain the same."
    converted_rule_found = False
    other_rule_untouched = False
    for fr_in_list in updated_context.files_to_process:
        if fr_in_list.id == gloss_fr.id:  # Should be the same rule object, modified
            assert fr_in_list.map_type == "ROUGHNESS", "GLOSS map_type should be changed to ROUGHNESS."
            # The filename_pattern may or may not be updated, depending on stage logic;
            # the primary identifiers are map_type and ID, so it is not asserted here.
            converted_rule_found = True
        elif fr_in_list.id == other_fr.id:
            assert fr_in_list.map_type == "NORMAL", "Other map_type should remain unchanged."
            other_rule_untouched = True
    assert converted_rule_found, "The converted GLOSS rule was not found or not updated correctly in files_to_process."
    assert other_rule_untouched, "The non-GLOSS rule was modified unexpectedly."

    # Check context.processed_maps_details.
    assert len(updated_context.processed_maps_details) == 2, "Number of entries in processed_maps_details should remain the same."

    gloss_detail = updated_context.processed_maps_details[gloss_fr.id.hex]
    assert "rough_from_gloss_" in gloss_detail['temp_processed_file'], "temp_processed_file for gloss map not updated."
    assert Path(gloss_detail['temp_processed_file']).name == saved_path_arg.name, "Path in details should match saved path."
    assert gloss_detail['original_map_type_before_conversion'] == "GLOSS", "original_map_type_before_conversion not set correctly."
    assert "Converted from GLOSS to ROUGHNESS" in gloss_detail['notes'], "Conversion notes not added or incorrect."
    assert gloss_detail['map_type'] == "ROUGHNESS", "map_type in details not updated to ROUGHNESS."

    other_detail = updated_context.processed_maps_details[other_fr.id.hex]
    assert other_detail['temp_processed_file'] == str(initial_other_temp_path), "Other map's temp_processed_file should be unchanged."
    assert other_detail['map_type'] == "NORMAL", "Other map's map_type should be unchanged."
    assert 'original_map_type_before_conversion' not in other_detail, "Other map should not have conversion history."
    assert 'notes' not in other_detail or "Converted from GLOSS" not in other_detail['notes'], "Other map should not have conversion notes."

    mock_logging.info.assert_any_call(f"Successfully converted GLOSS map {gloss_fr.id.hex} to ROUGHNESS.")

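The `call_args` unpacking used in the assertions above is standard `unittest.mock` behavior; a self-contained illustration (the file name and keyword are arbitrary):

```python
from unittest import mock

m = mock.MagicMock()
m("out.png", b"pixels", quality=95)

# call_args is a (positional_args_tuple, kwargs_dict) pair.
args, kwargs = m.call_args
assert args == ("out.png", b"pixels")
assert kwargs == {"quality": 95}
# The indexing style used in the tests: call_args[0] is the positional tuple.
assert m.call_args[0][0] == "out.png"
```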
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.logging')  # Mock logging
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.save_image')
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.load_image')
def test_gloss_conversion_float_success(mock_load_image, mock_save_image, mock_logging):
    """
    Test successful conversion of a GLOSS map (float data) to ROUGHNESS.
    """
    stage = GlossToRoughConversionStage()

    gloss_rule_id = uuid.uuid4()
    gloss_fr = create_mock_file_rule_for_gloss_test(id_val=gloss_rule_id, map_type="GLOSS", filename_pattern="gloss_float.hdr")  # Example float format

    initial_gloss_temp_path = Path("/fake/temp_engine_dir/processed_gloss_float.hdr")
    initial_details = {
        gloss_fr.id.hex: {'temp_processed_file': str(initial_gloss_temp_path), 'status': 'Processed', 'map_type': 'GLOSS'}
    }
    context = create_gloss_conversion_mock_context(
        initial_file_rules=[gloss_fr],
        initial_processed_details=initial_details
    )

    mock_loaded_gloss_data = np.array([0.1, 0.5, 0.9], dtype=np.float32)
    mock_load_image.return_value = mock_loaded_gloss_data
    mock_save_image.return_value = True  # Simulate a successful save

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(initial_gloss_temp_path)

    expected_inverted_data = 1.0 - mock_loaded_gloss_data

    saved_path_arg = mock_save_image.call_args[0][0]
    saved_data_arg = mock_save_image.call_args[0][1]

    assert np.allclose(saved_data_arg, expected_inverted_data), "Image data (float) passed to save_image is not correctly inverted."
    assert "rough_from_gloss_" in saved_path_arg.name, "Saved file name should indicate conversion from gloss."
    assert saved_path_arg.parent == Path("/fake/temp_engine_dir"), "Saved file should be in the engine temp directory."
    assert gloss_fr.id.hex in saved_path_arg.name

    assert len(updated_context.files_to_process) == 1
    converted_rule = updated_context.files_to_process[0]
    assert converted_rule.id == gloss_fr.id
    assert converted_rule.map_type == "ROUGHNESS"

    gloss_detail = updated_context.processed_maps_details[gloss_fr.id.hex]
    assert "rough_from_gloss_" in gloss_detail['temp_processed_file']
    assert Path(gloss_detail['temp_processed_file']).name == saved_path_arg.name
    assert gloss_detail['original_map_type_before_conversion'] == "GLOSS"
    assert "Converted from GLOSS to ROUGHNESS" in gloss_detail['notes']
    assert gloss_detail['map_type'] == "ROUGHNESS"

    mock_logging.info.assert_any_call(f"Successfully converted GLOSS map {gloss_fr.id.hex} to ROUGHNESS.")

@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.logging')
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.save_image')
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.load_image')
def test_load_image_fails(mock_load_image, mock_save_image, mock_logging):
    """
    Test behavior when ipu.load_image fails (returns None).
    The original FileRule should be kept, and an error logged.
    """
    stage = GlossToRoughConversionStage()

    gloss_rule_id = uuid.uuid4()
    gloss_fr = create_mock_file_rule_for_gloss_test(id_val=gloss_rule_id, map_type="GLOSS", filename_pattern="gloss_fails_load.png")

    initial_gloss_temp_path = Path("/fake/temp_engine_dir/processed_gloss_fails_load.png")
    initial_details = {
        gloss_fr.id.hex: {'temp_processed_file': str(initial_gloss_temp_path), 'status': 'Processed', 'map_type': 'GLOSS'}
    }
    context = create_gloss_conversion_mock_context(
        initial_file_rules=[gloss_fr],
        initial_processed_details=initial_details
    )

    # Keep copies for comparison.
    original_file_rule_map_type = gloss_fr.map_type
    original_details_entry = context.processed_maps_details[gloss_fr.id.hex].copy()

    mock_load_image.return_value = None  # Simulate load failure

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(initial_gloss_temp_path)
    mock_save_image.assert_not_called()  # Save should not be attempted

    # Check context.files_to_process: the rule should be unchanged.
    assert len(updated_context.files_to_process) == 1
    processed_rule = updated_context.files_to_process[0]
    assert processed_rule.id == gloss_fr.id
    assert processed_rule.map_type == original_file_rule_map_type, "FileRule map_type should not change if load fails."
    assert processed_rule.map_type == "GLOSS"  # Explicitly check it is still GLOSS

    # Check context.processed_maps_details: the details should be unchanged.
    current_details_entry = updated_context.processed_maps_details[gloss_fr.id.hex]
    assert current_details_entry == original_details_entry
    assert current_details_entry['temp_processed_file'] == str(initial_gloss_temp_path)
    assert current_details_entry['map_type'] == "GLOSS"
    assert 'original_map_type_before_conversion' not in current_details_entry
    assert 'notes' not in current_details_entry or "Converted from GLOSS" not in current_details_entry['notes']

    mock_logging.error.assert_called_once_with(
        f"Failed to load image data for GLOSS map {gloss_fr.id.hex} from {initial_gloss_temp_path}. Skipping conversion for this map."
    )

@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.logging')
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.save_image')
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.load_image')
def test_save_image_fails(mock_load_image, mock_save_image, mock_logging):
    """
    Test behavior when ipu.save_image fails (returns False).
    The original FileRule should be kept, and an error logged.
    """
    stage = GlossToRoughConversionStage()

    gloss_rule_id = uuid.uuid4()
    gloss_fr = create_mock_file_rule_for_gloss_test(id_val=gloss_rule_id, map_type="GLOSS", filename_pattern="gloss_fails_save.png")

    initial_gloss_temp_path = Path("/fake/temp_engine_dir/processed_gloss_fails_save.png")
    initial_details = {
        gloss_fr.id.hex: {'temp_processed_file': str(initial_gloss_temp_path), 'status': 'Processed', 'map_type': 'GLOSS'}
    }
    context = create_gloss_conversion_mock_context(
        initial_file_rules=[gloss_fr],
        initial_processed_details=initial_details
    )

    original_file_rule_map_type = gloss_fr.map_type
    original_details_entry = context.processed_maps_details[gloss_fr.id.hex].copy()

    mock_loaded_gloss_data = np.array([10, 50, 250], dtype=np.uint8)
    mock_load_image.return_value = mock_loaded_gloss_data
    mock_save_image.return_value = False  # Simulate save failure

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(initial_gloss_temp_path)

    # Check that save_image was called with the correct data and path.
    expected_inverted_data = 255 - mock_loaded_gloss_data
    # call_args[0] is a tuple of positional args.
    saved_path_arg = mock_save_image.call_args[0][0]
    saved_data_arg = mock_save_image.call_args[0][1]

    assert np.array_equal(saved_data_arg, expected_inverted_data), "Image data passed to save_image is not correctly inverted even on failure."
    assert "rough_from_gloss_" in saved_path_arg.name, "Attempted save file name should indicate conversion from gloss."
    assert saved_path_arg.parent == Path("/fake/temp_engine_dir"), "Attempted save file should be in the engine temp directory."

    # Check context.files_to_process: the rule should be unchanged.
    assert len(updated_context.files_to_process) == 1
    processed_rule = updated_context.files_to_process[0]
    assert processed_rule.id == gloss_fr.id
    assert processed_rule.map_type == original_file_rule_map_type, "FileRule map_type should not change if save fails."
    assert processed_rule.map_type == "GLOSS"

    # Check context.processed_maps_details: the details should be unchanged.
    current_details_entry = updated_context.processed_maps_details[gloss_fr.id.hex]
    assert current_details_entry == original_details_entry
    assert current_details_entry['temp_processed_file'] == str(initial_gloss_temp_path)
    assert current_details_entry['map_type'] == "GLOSS"
    assert 'original_map_type_before_conversion' not in current_details_entry
    assert 'notes' not in current_details_entry or "Converted from GLOSS" not in current_details_entry['notes']

    mock_logging.error.assert_called_once_with(
        f"Failed to save inverted GLOSS map {gloss_fr.id.hex} to {saved_path_arg}. Retaining original GLOSS map."
    )

@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.logging')
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.save_image')
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.load_image')
def test_gloss_map_in_files_to_process_but_not_in_details(mock_load_image, mock_save_image, mock_logging):
    """
    Test behavior when a GLOSS FileRule is in files_to_process but its details
    are missing from processed_maps_details.
    The stage should log an error and skip this FileRule.
    """
    stage = GlossToRoughConversionStage()

    gloss_rule_id = uuid.uuid4()
    # This FileRule is in files_to_process.
    gloss_fr_in_list = create_mock_file_rule_for_gloss_test(id_val=gloss_rule_id, map_type="GLOSS", filename_pattern="orphan_gloss.png")

    # processed_maps_details is empty, i.e. it does not contain gloss_fr_in_list.id.hex.
    initial_details = {}

    context = create_gloss_conversion_mock_context(
        initial_file_rules=[gloss_fr_in_list],
        initial_processed_details=initial_details
    )

    original_files_to_process = list(context.files_to_process)
    original_processed_maps_details = context.processed_maps_details.copy()

    updated_context = stage.execute(context)

    mock_load_image.assert_not_called()  # Load should not be attempted if details are missing
    mock_save_image.assert_not_called()  # Save should not be attempted

    # Check context.files_to_process: the rule should be unchanged.
    assert updated_context.files_to_process == original_files_to_process
    assert len(updated_context.files_to_process) == 1
    processed_rule = updated_context.files_to_process[0]
    assert processed_rule.id == gloss_fr_in_list.id
    assert processed_rule.map_type == "GLOSS", "FileRule map_type should not change if its details are missing."

    # Check context.processed_maps_details: should remain unchanged.
    assert updated_context.processed_maps_details == original_processed_maps_details, "processed_maps_details should not change."

    mock_logging.error.assert_called_once_with(
        f"GLOSS map {gloss_fr_in_list.id.hex} found in files_to_process but missing from processed_maps_details. Skipping conversion."
    )


# Test for case 8.2 (a GLOSS map ID is in processed_maps_details but no corresponding
# FileRule is in files_to_process): this case is implicitly handled because the stage
# iterates files_to_process. If a FileRule is not in files_to_process, its corresponding
# entry in processed_maps_details (if any) is never acted upon. The test below simply
# ensures no errors occur and that non-relevant details are untouched.
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.logging')
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.save_image')
@mock.patch('processing.pipeline.stages.gloss_to_rough_conversion.ipu.load_image')
def test_gloss_detail_exists_but_not_in_files_to_process(mock_load_image, mock_save_image, mock_logging):
    """
    Test that if a GLOSS map detail exists in processed_maps_details but
    no corresponding FileRule is in files_to_process, it is simply ignored
    without error, and other valid conversions proceed.
    """
    stage = GlossToRoughConversionStage()

    # This rule will be processed.
    convert_rule_id = uuid.uuid4()
    convert_fr = create_mock_file_rule_for_gloss_test(id_val=convert_rule_id, map_type="GLOSS", filename_pattern="convert_me.png")
    convert_initial_temp_path = Path("/fake/temp_engine_dir/processed_convert_me.png")

    # This rule's details exist, but the rule itself is not in files_to_process.
    orphan_detail_id = uuid.uuid4()

    initial_details = {
        convert_fr.id.hex: {'temp_processed_file': str(convert_initial_temp_path), 'status': 'Processed', 'map_type': 'GLOSS'},
        orphan_detail_id.hex: {'temp_processed_file': '/fake/temp_engine_dir/orphan.png', 'status': 'Processed', 'map_type': 'GLOSS', 'notes': 'This is an orphan'}
    }

    context = create_gloss_conversion_mock_context(
        initial_file_rules=[convert_fr],  # Only convert_fr is in files_to_process
        initial_processed_details=initial_details
    )

    mock_loaded_data = np.array([100], dtype=np.uint8)
    mock_load_image.return_value = mock_loaded_data
    mock_save_image.return_value = True

    updated_context = stage.execute(context)

    # Assert that load/save were called only for the rule in files_to_process.
    mock_load_image.assert_called_once_with(convert_initial_temp_path)
    mock_save_image.assert_called_once()  # Call details are checked in other tests

    # Check that the orphan detail in processed_maps_details is untouched.
    assert orphan_detail_id.hex in updated_context.processed_maps_details
    orphan_entry = updated_context.processed_maps_details[orphan_detail_id.hex]
    assert orphan_entry['temp_processed_file'] == '/fake/temp_engine_dir/orphan.png'
    assert orphan_entry['map_type'] == 'GLOSS'
    assert orphan_entry['notes'] == 'This is an orphan'
    assert 'original_map_type_before_conversion' not in orphan_entry

    # Check that the processed rule was indeed converted.
    assert convert_fr.id.hex in updated_context.processed_maps_details
    converted_entry = updated_context.processed_maps_details[convert_fr.id.hex]
    assert converted_entry['map_type'] == 'ROUGHNESS'
    assert "rough_from_gloss_" in converted_entry['temp_processed_file']

    # No errors should have been logged regarding the orphan detail.
    for call_args in mock_logging.error.call_args_list:
        assert str(orphan_detail_id.hex) not in call_args[0][0], "Error logged for orphan detail"
@@ -0,0 +1,555 @@
import pytest
|
||||
from unittest import mock
|
||||
from pathlib import Path
|
||||
import uuid
|
||||
import numpy as np
|
||||
from typing import Optional # Added for type hinting in helper functions
|
||||
|
||||
from processing.pipeline.stages.individual_map_processing import IndividualMapProcessingStage
|
||||
from processing.pipeline.asset_context import AssetProcessingContext
|
||||
from rule_structure import AssetRule, SourceRule, FileRule, TransformSettings # Key models
|
||||
from configuration import Configuration, GeneralSettings
|
||||
# cv2 might be imported by the stage for interpolation constants, ensure it's mockable if so.
|
||||
# For now, assume ipu handles interpolation details.
|
||||
|
||||
def create_mock_transform_settings(
|
||||
target_width=0, target_height=0, resize_mode="FIT",
|
||||
ensure_pot=False, allow_upscale=True, target_color_profile="RGB" # Add other fields as needed
|
||||
) -> mock.MagicMock:
|
||||
ts = mock.MagicMock(spec=TransformSettings)
|
||||
ts.target_width = target_width
|
||||
ts.target_height = target_height
|
||||
ts.resize_mode = resize_mode
|
||||
ts.ensure_pot = ensure_pot
|
||||
ts.allow_upscale = allow_upscale
|
||||
ts.target_color_profile = target_color_profile
|
||||
# ts.resize_filter = "AREA" # if your stage uses this
|
||||
return ts
|
||||
|
||||
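As a side note on the helper above: `MagicMock(spec=...)` restricts attribute *access* to names the spec class defines, while still allowing tests to *set* extra attributes. A minimal sketch, using a hypothetical stand-in class rather than the real `TransformSettings` model:

```python
from unittest import mock

# Hypothetical stand-in for the real TransformSettings model, for illustration only.
class TransformSettings:
    target_width = 0
    resize_mode = "FIT"

ts = mock.MagicMock(spec=TransformSettings)
ts.target_width = 128  # Setting attributes known to the spec is allowed
assert ts.target_width == 128

# Accessing an attribute the spec class does not define raises AttributeError,
# which catches typos in tests early.
try:
    ts.no_such_field
    raised = False
except AttributeError:
    raised = True
assert raised
```

Using `spec_set=` instead of `spec=` would additionally forbid setting unknown attributes.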
def create_mock_file_rule_for_individual_processing(
    id_val: Optional[uuid.UUID] = None,
    map_type: str = "ALBEDO",
    filename_pattern: str = "albedo_*.png",  # Pattern for glob
    item_type: str = "MAP_COL",
    active: bool = True,
    transform_settings: Optional[mock.MagicMock] = None
) -> mock.MagicMock:
    mock_fr = mock.MagicMock(spec=FileRule)
    mock_fr.id = id_val if id_val else uuid.uuid4()
    mock_fr.map_type = map_type
    mock_fr.filename_pattern = filename_pattern
    mock_fr.item_type = item_type
    mock_fr.active = active
    mock_fr.transform_settings = transform_settings if transform_settings else create_mock_transform_settings()
    return mock_fr

def create_individual_map_proc_mock_context(
    initial_file_rules: Optional[list] = None,
    asset_source_path_str: str = "/fake/asset_source",
    skip_asset_flag: bool = False,
    asset_name: str = "IndividualMapAsset"
) -> AssetProcessingContext:
    mock_asset_rule = mock.MagicMock(spec=AssetRule)
    mock_asset_rule.name = asset_name
    mock_asset_rule.source_path = Path(asset_source_path_str)
    # file_rules on AssetRule is not used directly by the stage; context.files_to_process is.

    mock_source_rule = mock.MagicMock(spec=SourceRule)
    mock_config = mock.MagicMock(spec=Configuration)
    # mock_config.general_settings = mock.MagicMock(spec=GeneralSettings)  # If needed

    context = AssetProcessingContext(
        source_rule=mock_source_rule,
        asset_rule=mock_asset_rule,
        workspace_path=Path("/fake/workspace"),
        engine_temp_dir=Path("/fake/temp_engine_dir"),
        output_base_path=Path("/fake/output"),
        effective_supplier="ValidSupplier",
        asset_metadata={'asset_name': asset_name},
        processed_maps_details={},  # The stage populates this
        merged_maps_details={},
        files_to_process=list(initial_file_rules) if initial_file_rules else [],
        loaded_data_cache={},
        config_obj=mock_config,
        status_flags={'skip_asset': skip_asset_flag},
        incrementing_value=None,
        sha5_value=None  # TODO: confirm whether the parameter is sha5_value or sha_value
    )
    return context

@mock.patch('processing.pipeline.stages.individual_map_processing.ipu')
@mock.patch('logging.info')
def test_asset_skipped_if_flag_is_true(mock_log_info, mock_ipu):
    stage = IndividualMapProcessingStage()
    context = create_individual_map_proc_mock_context(skip_asset_flag=True)

    # Add a dummy file rule to ensure it is not processed.
    file_rule = create_mock_file_rule_for_individual_processing()
    context.files_to_process = [file_rule]

    updated_context = stage.execute(context)

    mock_ipu.load_image.assert_not_called()
    mock_ipu.save_image.assert_not_called()
    assert not updated_context.processed_maps_details  # No details should be added
    # Check for a skip log message, if applicable (depends on the stage's logging):
    # mock_log_info.assert_any_call("Skipping asset IndividualMapAsset due to status_flags['skip_asset'] = True")


@mock.patch('processing.pipeline.stages.individual_map_processing.ipu')
@mock.patch('logging.info')
def test_no_processing_if_no_map_col_rules(mock_log_info, mock_ipu):
    stage = IndividualMapProcessingStage()

    # Create a file rule that is NOT of item_type MAP_COL.
    non_map_col_rule = create_mock_file_rule_for_individual_processing(item_type="METADATA")
    context = create_individual_map_proc_mock_context(initial_file_rules=[non_map_col_rule])

    updated_context = stage.execute(context)

    mock_ipu.load_image.assert_not_called()
    mock_ipu.save_image.assert_not_called()
    assert not updated_context.processed_maps_details
    # mock_log_info.assert_any_call("No FileRules of item_type 'MAP_COL' to process for asset IndividualMapAsset.")


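Note the two patching styles in this file: the tests above patch the whole `ipu` module object, while the tests below patch individual `ipu` functions. A minimal illustration of why whole-object patching works — a `MagicMock` auto-creates child mocks for any attribute, so every `ipu.*` call is recorded:

```python
from unittest import mock

# Stand-in for @mock.patch('...individual_map_processing.ipu'): the injected
# object is a MagicMock whose attributes are auto-created child mocks.
ipu = mock.MagicMock()

ipu.load_image("/fake/a.png")  # any attribute is callable and records its calls

ipu.load_image.assert_called_once_with("/fake/a.png")
ipu.save_image.assert_not_called()
```

Patching individual functions instead (as the later tests do) is useful when each function needs its own `return_value` or `side_effect`.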
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.save_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.resize_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.calculate_target_dimensions')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.load_image')
@mock.patch('pathlib.Path.glob')  # Mocks Path.glob as used by the stage's _find_source_file
@mock.patch('logging.info')
@mock.patch('logging.error')
def test_individual_map_processing_success_no_resize(
    mock_log_error, mock_log_info, mock_path_glob, mock_load_image,
    mock_calc_dims, mock_resize_image, mock_save_image
):
    stage = IndividualMapProcessingStage()

    source_file_name = "albedo_source.png"
    mock_asset_source_path = Path("/fake/asset_source")
    mock_found_source_path = mock_asset_source_path / source_file_name

    # The stage's _find_source_file does roughly:
    #     search_path = Path(context.asset_rule.source_path)
    #     found_files = list(search_path.glob(filename_pattern))
    # Patching 'pathlib.Path.glob' at class level means every Path instance created
    # inside the stage uses the mock, so no per-instance patching is needed here.
    mock_path_glob.return_value = [mock_found_source_path]  # Glob finds one file

    ts = create_mock_transform_settings(target_width=100, target_height=100)
    file_rule = create_mock_file_rule_for_individual_processing(
        map_type="ALBEDO", filename_pattern="albedo_*.png", transform_settings=ts
    )
    context = create_individual_map_proc_mock_context(
        initial_file_rules=[file_rule],
        asset_source_path_str=str(mock_asset_source_path)  # Ensure the context uses this path
    )

    mock_img_data = np.zeros((100, 100, 3), dtype=np.uint8)  # Original dimensions
    mock_load_image.return_value = mock_img_data
    mock_calc_dims.return_value = (100, 100)  # No resize needed
    mock_save_image.return_value = True

    updated_context = stage.execute(context)

    # Because Path.glob is patched on the class, the call made via
    # Path(context.asset_rule.source_path).glob(...) is recorded on mock_path_glob.
    mock_path_glob.assert_called_once_with(file_rule.filename_pattern)

    mock_load_image.assert_called_once_with(mock_found_source_path)
    # The stage calls:
    #     ipu.calculate_target_dimensions(original_dims, ts.target_width, ts.target_height,
    #                                     ts.resize_mode, ts.ensure_pot, ts.allow_upscale)
    mock_calc_dims.assert_called_once_with(
        (100, 100), ts.target_width, ts.target_height, ts.resize_mode, ts.ensure_pot, ts.allow_upscale
    )
    mock_resize_image.assert_not_called()  # Crucial for this test case
    mock_save_image.assert_called_once()

    # Check the save path and data.
    saved_image_arg, saved_path_arg = mock_save_image.call_args[0]
    assert np.array_equal(saved_image_arg, mock_img_data)  # Correct image data is passed to save
    assert "processed_ALBEDO_" in saved_path_arg.name  # Based on map_type
    assert file_rule.id.hex in saved_path_arg.name  # Unique name includes the FileRule ID
    assert saved_path_arg.parent == context.engine_temp_dir

    assert file_rule.id.hex in updated_context.processed_maps_details
    details = updated_context.processed_maps_details[file_rule.id.hex]
    assert details['status'] == 'Processed'
    assert details['source_file'] == str(mock_found_source_path)
    assert Path(details['temp_processed_file']) == saved_path_arg
    assert details['original_dimensions'] == (100, 100)
    assert details['processed_dimensions'] == (100, 100)
    assert details['map_type'] == file_rule.map_type
    mock_log_error.assert_not_called()
    mock_log_info.assert_any_call(f"Successfully processed map {file_rule.map_type} (ID: {file_rule.id.hex}) for asset {context.asset_rule.name}. Output: {saved_path_arg}")


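The class-level `@mock.patch('pathlib.Path.glob')` used above can also be done as a context manager with `mock.patch.object`, which scopes the patch more tightly. A minimal sketch; the `find_pngs` helper is hypothetical, standing in for the stage's `_find_source_file` lookup:

```python
from pathlib import Path
from unittest import mock

def find_pngs(base_dir: str) -> list:
    # Mimics the stage's lookup: Path(base_dir).glob(pattern)
    return list(Path(base_dir).glob("*.png"))

# Patch glob on the Path class; any instance created inside find_pngs uses the mock.
with mock.patch.object(Path, "glob", return_value=[Path("/fake/a.png")]) as glob_mock:
    found = find_pngs("/fake/dir")

glob_mock.assert_called_once_with("*.png")
assert found == [Path("/fake/a.png")]
```

Either form works as long as the code under test calls `glob` on `pathlib.Path` instances; an `autospec=True` patch would additionally record the instance as the first call argument.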
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.save_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.resize_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.calculate_target_dimensions')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.load_image')
@mock.patch('pathlib.Path.glob')
@mock.patch('logging.info')
@mock.patch('logging.error')
def test_source_file_not_found(
    mock_log_error, mock_log_info, mock_path_glob, mock_load_image,
    mock_calc_dims, mock_resize_image, mock_save_image
):
    stage = IndividualMapProcessingStage()
    mock_asset_source_path = Path("/fake/asset_source")

    mock_path_glob.return_value = []  # Glob finds no files

    file_rule = create_mock_file_rule_for_individual_processing(filename_pattern="nonexistent_*.png")
    context = create_individual_map_proc_mock_context(
        initial_file_rules=[file_rule],
        asset_source_path_str=str(mock_asset_source_path)
    )

    updated_context = stage.execute(context)

    mock_path_glob.assert_called_once_with(file_rule.filename_pattern)
    mock_load_image.assert_not_called()
    mock_calc_dims.assert_not_called()
    mock_resize_image.assert_not_called()
    mock_save_image.assert_not_called()

    assert file_rule.id.hex in updated_context.processed_maps_details
    details = updated_context.processed_maps_details[file_rule.id.hex]
    assert details['status'] == 'Source Not Found'
    assert details['source_file'] is None
    assert details['temp_processed_file'] is None
    assert details['error_message'] is not None  # An error message must be present
    mock_log_error.assert_called_once()
    # Example: mock_log_error.assert_called_with(f"Could not find source file for rule {file_rule.id} (pattern: {file_rule.filename_pattern}) in {context.asset_rule.source_path}")


@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.save_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.resize_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.calculate_target_dimensions')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.load_image')
@mock.patch('pathlib.Path.glob')
@mock.patch('logging.info')
@mock.patch('logging.error')
def test_load_image_fails(
    mock_log_error, mock_log_info, mock_path_glob, mock_load_image,
    mock_calc_dims, mock_resize_image, mock_save_image
):
    stage = IndividualMapProcessingStage()
    source_file_name = "albedo_corrupt.png"
    mock_asset_source_path = Path("/fake/asset_source")
    mock_found_source_path = mock_asset_source_path / source_file_name
    mock_path_glob.return_value = [mock_found_source_path]

    mock_load_image.return_value = None  # Simulate a load failure

    file_rule = create_mock_file_rule_for_individual_processing(filename_pattern="albedo_*.png")
    context = create_individual_map_proc_mock_context(
        initial_file_rules=[file_rule],
        asset_source_path_str=str(mock_asset_source_path)
    )

    updated_context = stage.execute(context)

    mock_path_glob.assert_called_once_with(file_rule.filename_pattern)
    mock_load_image.assert_called_once_with(mock_found_source_path)
    mock_calc_dims.assert_not_called()
    mock_resize_image.assert_not_called()
    mock_save_image.assert_not_called()

    assert file_rule.id.hex in updated_context.processed_maps_details
    details = updated_context.processed_maps_details[file_rule.id.hex]
    assert details['status'] == 'Load Failed'
    assert details['source_file'] == str(mock_found_source_path)
    assert details['temp_processed_file'] is None
    assert details['error_message'] is not None
    mock_log_error.assert_called_once()
    # Example: mock_log_error.assert_called_with(f"Failed to load image {mock_found_source_path} for rule {file_rule.id}")


@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.save_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.resize_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.calculate_target_dimensions')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.load_image')
@mock.patch('pathlib.Path.glob')
@mock.patch('logging.info')
@mock.patch('logging.error')
def test_resize_occurs_when_dimensions_differ(
    mock_log_error, mock_log_info, mock_path_glob, mock_load_image,
    mock_calc_dims, mock_resize_image, mock_save_image
):
    stage = IndividualMapProcessingStage()
    source_file_name = "albedo_resize.png"
    mock_asset_source_path = Path("/fake/asset_source")
    mock_found_source_path = mock_asset_source_path / source_file_name
    mock_path_glob.return_value = [mock_found_source_path]

    original_dims = (100, 100)
    target_dims = (50, 50)  # Different dimensions force a resize
    mock_img_data = np.zeros((*original_dims, 3), dtype=np.uint8)
    mock_resized_img_data = np.zeros((*target_dims, 3), dtype=np.uint8)

    mock_load_image.return_value = mock_img_data
    ts = create_mock_transform_settings(target_width=target_dims[0], target_height=target_dims[1])
    # The stage calls:
    #     ipu.resize_image(loaded_image, target_dims_calculated, file_rule.transform_settings.resize_filter)
    # so resize_filter must be set BEFORE execute(); otherwise the assertion below
    # would compare against an auto-generated mock attribute rather than this value.
    ts.resize_filter = "LANCZOS4"  # Example filter
    file_rule = create_mock_file_rule_for_individual_processing(transform_settings=ts)
    context = create_individual_map_proc_mock_context(
        initial_file_rules=[file_rule],
        asset_source_path_str=str(mock_asset_source_path)
    )

    mock_calc_dims.return_value = target_dims  # calculate_target_dimensions returns the new dimensions
    mock_resize_image.return_value = mock_resized_img_data  # resize returns the new image data
    mock_save_image.return_value = True

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(mock_found_source_path)
    mock_calc_dims.assert_called_once_with(
        original_dims, ts.target_width, ts.target_height, ts.resize_mode, ts.ensure_pot, ts.allow_upscale
    )
    mock_resize_image.assert_called_once_with(mock_img_data, target_dims, ts.resize_filter)

    saved_image_arg, saved_path_arg = mock_save_image.call_args[0]
    assert np.array_equal(saved_image_arg, mock_resized_img_data)  # The resized data is what gets saved
    assert "processed_ALBEDO_" in saved_path_arg.name
    assert saved_path_arg.parent == context.engine_temp_dir

    assert file_rule.id.hex in updated_context.processed_maps_details
    details = updated_context.processed_maps_details[file_rule.id.hex]
    assert details['status'] == 'Processed'
    assert details['original_dimensions'] == original_dims
    assert details['processed_dimensions'] == target_dims
    mock_log_error.assert_not_called()


@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.save_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.resize_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.calculate_target_dimensions')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.load_image')
@mock.patch('pathlib.Path.glob')
@mock.patch('logging.info')
@mock.patch('logging.error')
def test_save_image_fails(
    mock_log_error, mock_log_info, mock_path_glob, mock_load_image,
    mock_calc_dims, mock_resize_image, mock_save_image
):
    stage = IndividualMapProcessingStage()
    source_file_name = "albedo_save_fail.png"
    mock_asset_source_path = Path("/fake/asset_source")
    mock_found_source_path = mock_asset_source_path / source_file_name
    mock_path_glob.return_value = [mock_found_source_path]

    mock_img_data = np.zeros((100, 100, 3), dtype=np.uint8)
    mock_load_image.return_value = mock_img_data
    mock_calc_dims.return_value = (100, 100)  # No resize
    mock_save_image.return_value = False  # Simulate a save failure

    ts = create_mock_transform_settings()
    file_rule = create_mock_file_rule_for_individual_processing(transform_settings=ts)
    context = create_individual_map_proc_mock_context(
        initial_file_rules=[file_rule],
        asset_source_path_str=str(mock_asset_source_path)
    )

    updated_context = stage.execute(context)

    mock_save_image.assert_called_once()  # The save attempt should still happen

    assert file_rule.id.hex in updated_context.processed_maps_details
    details = updated_context.processed_maps_details[file_rule.id.hex]
    assert details['status'] == 'Save Failed'
    assert details['source_file'] == str(mock_found_source_path)
    assert details['temp_processed_file'] is not None  # A path was generated
    assert details['error_message'] is not None
    mock_log_error.assert_called_once()
    # Example: mock_log_error.assert_called_with(f"Failed to save processed image for rule {file_rule.id} to {details['temp_processed_file']}")


@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.save_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.resize_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.calculate_target_dimensions')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.load_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.convert_bgr_to_rgb')
@mock.patch('pathlib.Path.glob')
@mock.patch('logging.info')
@mock.patch('logging.error')
def test_color_conversion_bgr_to_rgb(
    mock_log_error, mock_log_info, mock_path_glob, mock_convert_bgr, mock_load_image,
    mock_calc_dims, mock_resize_image, mock_save_image
):
    stage = IndividualMapProcessingStage()
    source_file_name = "albedo_bgr.png"
    mock_asset_source_path = Path("/fake/asset_source")
    mock_found_source_path = mock_asset_source_path / source_file_name
    mock_path_glob.return_value = [mock_found_source_path]

    mock_bgr_img_data = np.zeros((100, 100, 3), dtype=np.uint8)  # Loaded as BGR
    mock_rgb_img_data = np.zeros((100, 100, 3), dtype=np.uint8)  # After conversion

    mock_load_image.return_value = mock_bgr_img_data  # cv2 loads BGR by default
    mock_convert_bgr.return_value = mock_rgb_img_data  # Mock the conversion
    mock_calc_dims.return_value = (100, 100)  # No resize
    mock_save_image.return_value = True

    # Transform settings request RGB; the stage assumes the loaded image may be BGR.
    ts = create_mock_transform_settings(target_color_profile="RGB")
    file_rule = create_mock_file_rule_for_individual_processing(transform_settings=ts)
    context = create_individual_map_proc_mock_context(
        initial_file_rules=[file_rule],
        asset_source_path_str=str(mock_asset_source_path)
    )
    # The stage does:
    #     if file_rule.transform_settings.target_color_profile == "RGB" and loaded_image.shape[2] == 3:
    #         logger.info(f"Attempting to convert image from BGR to RGB for {file_rule_id_hex}")
    #         processed_image_data = ipu.convert_bgr_to_rgb(processed_image_data)

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(mock_found_source_path)
    mock_convert_bgr.assert_called_once_with(mock_bgr_img_data)
    mock_resize_image.assert_not_called()

    saved_image_arg, _ = mock_save_image.call_args[0]
    assert np.array_equal(saved_image_arg, mock_rgb_img_data)  # The RGB data is what gets saved
    mock_log_error.assert_not_called()
    mock_log_info.assert_any_call(f"Attempting to convert image from BGR to RGB for {file_rule.id.hex}")


@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.save_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.resize_image')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.calculate_target_dimensions')
@mock.patch('processing.pipeline.stages.individual_map_processing.ipu.load_image')
@mock.patch('pathlib.Path.glob')
@mock.patch('logging.info')
@mock.patch('logging.error')
def test_multiple_map_col_rules_processed(
    mock_log_error, mock_log_info, mock_path_glob, mock_load_image,
    mock_calc_dims, mock_resize_image, mock_save_image
):
    stage = IndividualMapProcessingStage()
    mock_asset_source_path = Path("/fake/asset_source")

    # Rule 1: Albedo, no resize.
    ts1 = create_mock_transform_settings(target_width=100, target_height=100)
    file_rule1_id = uuid.uuid4()
    file_rule1 = create_mock_file_rule_for_individual_processing(
        id_val=file_rule1_id, map_type="ALBEDO", filename_pattern="albedo_*.png", transform_settings=ts1
    )
    source_file1 = mock_asset_source_path / "albedo_map.png"
    img_data1 = np.zeros((100, 100, 3), dtype=np.uint8)

    # Rule 2: Roughness, resized from 200x200 to 50x50.
    ts2 = create_mock_transform_settings(target_width=50, target_height=50)
    ts2.resize_filter = "AREA"
    file_rule2_id = uuid.uuid4()
    file_rule2 = create_mock_file_rule_for_individual_processing(
        id_val=file_rule2_id, map_type="ROUGHNESS", filename_pattern="rough_*.png", transform_settings=ts2
    )
    source_file2 = mock_asset_source_path / "rough_map.png"
    img_data2_orig = np.zeros((200, 200, 1), dtype=np.uint8)  # Original, needs resize
    img_data2_resized = np.zeros((50, 50, 1), dtype=np.uint8)  # Resized

    context = create_individual_map_proc_mock_context(
        initial_file_rules=[file_rule1, file_rule2],
        asset_source_path_str=str(mock_asset_source_path)
    )

    # Path.glob, load_image, and calculate_target_dimensions are each called twice,
    # once per rule, so use side_effect lists in call order.
    mock_path_glob.side_effect = [
        [source_file1],  # For albedo_*.png
        [source_file2]   # For rough_*.png
    ]
    mock_load_image.side_effect = [img_data1, img_data2_orig]
    mock_calc_dims.side_effect = [
        (100, 100),  # Rule 1: no change
        (50, 50)     # Rule 2: change
    ]
    mock_resize_image.return_value = img_data2_resized  # Only called for rule 2
    mock_save_image.return_value = True

    updated_context = stage.execute(context)

    # Assertions for rule 1 (Albedo).
    assert mock_path_glob.call_args_list[0][0][0] == file_rule1.filename_pattern
    assert mock_load_image.call_args_list[0][0][0] == source_file1
    assert mock_calc_dims.call_args_list[0][0] == ((100, 100), ts1.target_width, ts1.target_height, ts1.resize_mode, ts1.ensure_pot, ts1.allow_upscale)

    # Assertions for rule 2 (Roughness).
    assert mock_path_glob.call_args_list[1][0][0] == file_rule2.filename_pattern
    assert mock_load_image.call_args_list[1][0][0] == source_file2
    assert mock_calc_dims.call_args_list[1][0] == ((200, 200), ts2.target_width, ts2.target_height, ts2.resize_mode, ts2.ensure_pot, ts2.allow_upscale)
    mock_resize_image.assert_called_once_with(img_data2_orig, (50, 50), ts2.resize_filter)

    assert mock_save_image.call_count == 2
    # Saved image for rule 1.
    saved_img1_arg, saved_path1_arg = mock_save_image.call_args_list[0][0]
    assert np.array_equal(saved_img1_arg, img_data1)
    assert "processed_ALBEDO_" in saved_path1_arg.name
    assert file_rule1_id.hex in saved_path1_arg.name

    # Saved image for rule 2.
    saved_img2_arg, saved_path2_arg = mock_save_image.call_args_list[1][0]
    assert np.array_equal(saved_img2_arg, img_data2_resized)
    assert "processed_ROUGHNESS_" in saved_path2_arg.name
    assert file_rule2_id.hex in saved_path2_arg.name

    # Context details.
    assert file_rule1_id.hex in updated_context.processed_maps_details
    details1 = updated_context.processed_maps_details[file_rule1_id.hex]
    assert details1['status'] == 'Processed'
    assert details1['original_dimensions'] == (100, 100)
    assert details1['processed_dimensions'] == (100, 100)

    assert file_rule2_id.hex in updated_context.processed_maps_details
    details2 = updated_context.processed_maps_details[file_rule2_id.hex]
    assert details2['status'] == 'Processed'
    assert details2['original_dimensions'] == (200, 200)  # Original dims of img_data2_orig
    assert details2['processed_dimensions'] == (50, 50)

    mock_log_error.assert_not_called()
538
tests/processing/pipeline/stages/test_map_merging.py
Normal file
@ -0,0 +1,538 @@
import pytest
from unittest import mock
from pathlib import Path
import uuid
import numpy as np
from typing import Optional  # For type hints in the helpers below

from processing.pipeline.stages.map_merging import MapMergingStage
from processing.pipeline.asset_context import AssetProcessingContext
from rule_structure import AssetRule, SourceRule, FileRule, MergeSettings, MergeInputChannel
from configuration import Configuration


# Mock helper functions
def create_mock_merge_input_channel(
    file_rule_id: uuid.UUID, source_channel: int = 0, target_channel: int = 0, invert: bool = False
) -> mock.MagicMock:
    mic = mock.MagicMock(spec=MergeInputChannel)
    mic.file_rule_id = file_rule_id
    mic.source_channel = source_channel
    mic.target_channel = target_channel
    mic.invert_source_channel = invert
    mic.default_value_if_missing = 0  # Or some other default
    return mic


def create_mock_merge_settings(
    input_maps: Optional[list] = None,  # List of mock MergeInputChannel
    output_channels: int = 3
) -> mock.MagicMock:
    ms = mock.MagicMock(spec=MergeSettings)
    ms.input_maps = input_maps if input_maps is not None else []
    ms.output_channels = output_channels
    return ms

def create_mock_file_rule_for_merging(
    id_val: Optional[uuid.UUID] = None,
    map_type: str = "ORM",  # Output map type
    item_type: str = "MAP_MERGE",
    merge_settings: Optional[mock.MagicMock] = None
) -> mock.MagicMock:
    mock_fr = mock.MagicMock(spec=FileRule)
    mock_fr.id = id_val if id_val else uuid.uuid4()
    mock_fr.map_type = map_type
    mock_fr.filename_pattern = f"{map_type.lower()}_merged.png"  # Placeholder
    mock_fr.item_type = item_type
    mock_fr.active = True
    mock_fr.merge_settings = merge_settings if merge_settings else create_mock_merge_settings()
    return mock_fr

def create_map_merging_mock_context(
    initial_file_rules: Optional[list] = None,  # Will contain the MAP_MERGE rule
    initial_processed_details: Optional[dict] = None,  # Pre-processed inputs for the merge
    skip_asset_flag: bool = False,
    asset_name: str = "MergeAsset"
) -> AssetProcessingContext:
    mock_asset_rule = mock.MagicMock(spec=AssetRule)
    mock_asset_rule.name = asset_name
    mock_source_rule = mock.MagicMock(spec=SourceRule)
    mock_config = mock.MagicMock(spec=Configuration)

    context = AssetProcessingContext(
        source_rule=mock_source_rule,
        asset_rule=mock_asset_rule,
        workspace_path=Path("/fake/workspace"),
        engine_temp_dir=Path("/fake/temp_engine_dir"),
        output_base_path=Path("/fake/output"),
        effective_supplier="ValidSupplier",
        asset_metadata={'asset_name': asset_name},
        processed_maps_details=initial_processed_details if initial_processed_details is not None else {},
        merged_maps_details={},  # The stage populates this
        files_to_process=list(initial_file_rules) if initial_file_rules else [],
        loaded_data_cache={},
        config_obj=mock_config,
        status_flags={'skip_asset': skip_asset_flag},
        incrementing_value=None,
        sha5_value=None  # TODO: confirm whether the parameter is sha5_value or sha_value
    )
    return context

def test_asset_skipped():
    stage = MapMergingStage()
    context = create_map_merging_mock_context(skip_asset_flag=True)

    updated_context = stage.execute(context)

    assert updated_context == context  # No changes expected
    assert not updated_context.merged_maps_details  # No maps should be merged


def test_no_map_merge_rules():
    stage = MapMergingStage()
    # Context with a non-MAP_MERGE rule only
    non_merge_rule = create_mock_file_rule_for_merging(item_type="TEXTURE_MAP", map_type="Diffuse")
    context = create_map_merging_mock_context(initial_file_rules=[non_merge_rule])

    updated_context = stage.execute(context)

    assert updated_context == context  # No changes expected
    assert not updated_context.merged_maps_details  # No maps should be merged


@mock.patch('processing.pipeline.stages.map_merging.ipu.save_image')
@mock.patch('processing.pipeline.stages.map_merging.ipu.resize_image')  # Patched even though no resizing is expected here
@mock.patch('processing.pipeline.stages.map_merging.ipu.load_image')
@mock.patch('logging.info')
@mock.patch('logging.error')
def test_map_merging_rgb_success(mock_log_error, mock_log_info, mock_load_image, mock_resize_image, mock_save_image):
    stage = MapMergingStage()

    # Input FileRules (mocked as already processed)
    r_id, g_id, b_id = uuid.uuid4(), uuid.uuid4(), uuid.uuid4()
    processed_details = {
        r_id.hex: {'temp_processed_file': '/fake/red.png', 'status': 'Processed', 'map_type': 'RED_SRC'},
        g_id.hex: {'temp_processed_file': '/fake/green.png', 'status': 'Processed', 'map_type': 'GREEN_SRC'},
        b_id.hex: {'temp_processed_file': '/fake/blue.png', 'status': 'Processed', 'map_type': 'BLUE_SRC'}
    }
    # Mock loaded image data (grayscale inputs)
    mock_r_data = np.full((10, 10), 200, dtype=np.uint8)
    mock_g_data = np.full((10, 10), 100, dtype=np.uint8)
    mock_b_data = np.full((10, 10), 50, dtype=np.uint8)
    mock_load_image.side_effect = [mock_r_data, mock_g_data, mock_b_data]

    # Merge rule setup
    merge_inputs = [
        create_mock_merge_input_channel(file_rule_id=r_id, source_channel=0, target_channel=0),  # R to R
        create_mock_merge_input_channel(file_rule_id=g_id, source_channel=0, target_channel=1),  # G to G
        create_mock_merge_input_channel(file_rule_id=b_id, source_channel=0, target_channel=2)   # B to B
    ]
    merge_settings = create_mock_merge_settings(input_maps=merge_inputs, output_channels=3)
    merge_rule_id = uuid.uuid4()
    merge_rule = create_mock_file_rule_for_merging(id_val=merge_rule_id, map_type="RGB_Combined", merge_settings=merge_settings)

    context = create_map_merging_mock_context(
        initial_file_rules=[merge_rule],
        initial_processed_details=processed_details
    )
    mock_save_image.return_value = True

    updated_context = stage.execute(context)

    assert mock_load_image.call_count == 3
    mock_resize_image.assert_not_called()  # All inputs share the same dimensions in this test
    mock_save_image.assert_called_once()

    # Check that the correct filename was passed to save_image. The stage builds it as:
    #   f"{context.asset_rule.name}_merged_{merge_rule.map_type}{Path(first_input_path).suffix}"
    # Here first_input_path is '/fake/red.png' (suffix '.png') and the asset name is "MergeAsset".
    expected_filename_part = f"{context.asset_rule.name}_merged_{merge_rule.map_type}.png"
    saved_path_arg = mock_save_image.call_args[0][0]
    assert expected_filename_part in str(saved_path_arg)

    saved_data = mock_save_image.call_args[0][1]
    assert saved_data.shape == (10, 10, 3)
    assert np.all(saved_data[:, :, 0] == 200)  # Red channel
    assert np.all(saved_data[:, :, 1] == 100)  # Green channel
    assert np.all(saved_data[:, :, 2] == 50)   # Blue channel

    assert merge_rule.id.hex in updated_context.merged_maps_details
    details = updated_context.merged_maps_details[merge_rule.id.hex]
    assert details['status'] == 'Processed'
    # The temp_merged_file path is engine_temp_dir / asset_name / filename
    assert f"{context.engine_temp_dir / context.asset_rule.name / expected_filename_part}" == details['temp_merged_file']
    mock_log_error.assert_not_called()
    mock_log_info.assert_any_call(f"Successfully merged map '{merge_rule.map_type}' for asset '{context.asset_rule.name}'.")
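

The channel-routing semantics these tests exercise can be sketched standalone. This is a minimal illustration using NumPy only; `merge_channels` is a hypothetical stand-in for the stage's internal logic, not its actual implementation:

```python
import numpy as np

def merge_channels(inputs, output_channels):
    """Route source channels into an output stack; the last writer to a target channel wins."""
    # inputs: list of (image, source_channel, target_channel, invert, default_if_missing)
    height, width = inputs[0][0].shape[:2]
    out = np.zeros((height, width, output_channels), dtype=np.uint8)
    for img, src, tgt, invert, default in inputs:
        n_channels = 1 if img.ndim == 2 else img.shape[2]
        if src >= n_channels:
            # Source channel absent: fill with the configured default value
            chan = np.full((height, width), default, dtype=np.uint8)
        elif img.ndim == 2:
            chan = img
        else:
            chan = img[:, :, src]
        if invert:
            chan = 255 - chan
        out[:, :, tgt] = chan
    # A single-channel result is returned as a 2D grayscale array
    return out[:, :, 0] if output_channels == 1 else out

r = np.full((10, 10), 200, dtype=np.uint8)
g = np.full((10, 10), 100, dtype=np.uint8)
b = np.full((10, 10), 50, dtype=np.uint8)
merged = merge_channels([(r, 0, 0, False, 0), (g, 0, 1, False, 0), (b, 0, 2, False, 0)], 3)
```

The same helper reproduces the inversion, grayscale-output, and default-value cases asserted in the tests below.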


@mock.patch('processing.pipeline.stages.map_merging.ipu.save_image')
@mock.patch('processing.pipeline.stages.map_merging.ipu.resize_image')
@mock.patch('processing.pipeline.stages.map_merging.ipu.load_image')
@mock.patch('logging.info')
@mock.patch('logging.error')
def test_map_merging_channel_inversion(mock_log_error, mock_log_info, mock_load_image, mock_resize_image, mock_save_image):
    stage = MapMergingStage()

    # Input FileRule
    input_id = uuid.uuid4()
    processed_details = {
        input_id.hex: {'temp_processed_file': '/fake/source.png', 'status': 'Processed', 'map_type': 'SOURCE_MAP'}
    }
    # Mock loaded image data (single channel for simplicity, to be inverted)
    mock_source_data = np.array([[0, 100], [155, 255]], dtype=np.uint8)
    mock_load_image.return_value = mock_source_data

    # Merge rule setup: one inverted input mapped to a single output channel
    merge_inputs = [
        create_mock_merge_input_channel(file_rule_id=input_id, source_channel=0, target_channel=0, invert=True)
    ]
    merge_settings = create_mock_merge_settings(input_maps=merge_inputs, output_channels=1)
    merge_rule_id = uuid.uuid4()
    merge_rule = create_mock_file_rule_for_merging(id_val=merge_rule_id, map_type="Inverted_Gray", merge_settings=merge_settings)

    context = create_map_merging_mock_context(
        initial_file_rules=[merge_rule],
        initial_processed_details=processed_details
    )
    mock_save_image.return_value = True

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(Path('/fake/source.png'))
    mock_resize_image.assert_not_called()
    mock_save_image.assert_called_once()

    saved_data = mock_save_image.call_args[0][1]
    assert saved_data.shape == (2, 2)  # Grayscale output

    # Expected inverted data: 255 - original
    expected_inverted_data = np.array([[255, 155], [100, 0]], dtype=np.uint8)
    assert np.all(saved_data == expected_inverted_data)

    assert merge_rule.id.hex in updated_context.merged_maps_details
    details = updated_context.merged_maps_details[merge_rule.id.hex]
    assert details['status'] == 'Processed'
    assert "merged_Inverted_Gray" in details['temp_merged_file']
    mock_log_error.assert_not_called()
    mock_log_info.assert_any_call(f"Successfully merged map '{merge_rule.map_type}' for asset '{context.asset_rule.name}'.")


@mock.patch('processing.pipeline.stages.map_merging.ipu.save_image')
@mock.patch('processing.pipeline.stages.map_merging.ipu.load_image')
@mock.patch('logging.error')
def test_map_merging_input_map_missing(mock_log_error, mock_load_image, mock_save_image):
    stage = MapMergingStage()

    # Input FileRule ID that will be missing from processed_details
    missing_input_id = uuid.uuid4()

    # Merge rule setup
    merge_inputs = [
        create_mock_merge_input_channel(file_rule_id=missing_input_id, source_channel=0, target_channel=0)
    ]
    merge_settings = create_mock_merge_settings(input_maps=merge_inputs, output_channels=1)
    merge_rule_id = uuid.uuid4()
    merge_rule = create_mock_file_rule_for_merging(id_val=merge_rule_id, map_type="TestMissing", merge_settings=merge_settings)

    # processed_details is empty, so missing_input_id will not be found
    context = create_map_merging_mock_context(
        initial_file_rules=[merge_rule],
        initial_processed_details={}
    )

    updated_context = stage.execute(context)

    mock_load_image.assert_not_called()
    mock_save_image.assert_not_called()

    assert merge_rule.id.hex in updated_context.merged_maps_details
    details = updated_context.merged_maps_details[merge_rule.id.hex]
    assert details['status'] == 'Failed'
    assert 'error_message' in details
    assert f"Input map FileRule ID {missing_input_id.hex} not found in processed_maps_details or not successfully processed" in details['error_message']

    mock_log_error.assert_called_once()
    assert f"Failed to merge map '{merge_rule.map_type}' for asset '{context.asset_rule.name}'" in mock_log_error.call_args[0][0]
    assert f"Input map FileRule ID {missing_input_id.hex} not found in processed_maps_details or not successfully processed" in mock_log_error.call_args[0][0]


@mock.patch('processing.pipeline.stages.map_merging.ipu.save_image')
@mock.patch('processing.pipeline.stages.map_merging.ipu.load_image')
@mock.patch('logging.error')
def test_map_merging_input_map_status_not_processed(mock_log_error, mock_load_image, mock_save_image):
    stage = MapMergingStage()

    input_id = uuid.uuid4()
    processed_details = {
        # Status is 'Failed', not 'Processed'
        input_id.hex: {'temp_processed_file': '/fake/source.png', 'status': 'Failed', 'map_type': 'SOURCE_MAP'}
    }

    merge_inputs = [
        create_mock_merge_input_channel(file_rule_id=input_id, source_channel=0, target_channel=0)
    ]
    merge_settings = create_mock_merge_settings(input_maps=merge_inputs, output_channels=1)
    merge_rule_id = uuid.uuid4()
    merge_rule = create_mock_file_rule_for_merging(id_val=merge_rule_id, map_type="TestNotProcessed", merge_settings=merge_settings)

    context = create_map_merging_mock_context(
        initial_file_rules=[merge_rule],
        initial_processed_details=processed_details
    )

    updated_context = stage.execute(context)

    mock_load_image.assert_not_called()
    mock_save_image.assert_not_called()

    assert merge_rule.id.hex in updated_context.merged_maps_details
    details = updated_context.merged_maps_details[merge_rule.id.hex]
    assert details['status'] == 'Failed'
    assert 'error_message' in details
    assert f"Input map FileRule ID {input_id.hex} not found in processed_maps_details or not successfully processed" in details['error_message']

    mock_log_error.assert_called_once()
    assert f"Failed to merge map '{merge_rule.map_type}' for asset '{context.asset_rule.name}'" in mock_log_error.call_args[0][0]
    assert f"Input map FileRule ID {input_id.hex} not found in processed_maps_details or not successfully processed" in mock_log_error.call_args[0][0]


@mock.patch('processing.pipeline.stages.map_merging.ipu.save_image')
@mock.patch('processing.pipeline.stages.map_merging.ipu.load_image')
@mock.patch('logging.error')
def test_map_merging_load_image_fails(mock_log_error, mock_load_image, mock_save_image):
    stage = MapMergingStage()

    input_id = uuid.uuid4()
    processed_details = {
        input_id.hex: {'temp_processed_file': '/fake/source.png', 'status': 'Processed', 'map_type': 'SOURCE_MAP'}
    }

    # Configure mock_load_image to raise an exception
    mock_load_image.side_effect = Exception("Failed to load image")

    merge_inputs = [
        create_mock_merge_input_channel(file_rule_id=input_id, source_channel=0, target_channel=0)
    ]
    merge_settings = create_mock_merge_settings(input_maps=merge_inputs, output_channels=1)
    merge_rule_id = uuid.uuid4()
    merge_rule = create_mock_file_rule_for_merging(id_val=merge_rule_id, map_type="TestLoadFail", merge_settings=merge_settings)

    context = create_map_merging_mock_context(
        initial_file_rules=[merge_rule],
        initial_processed_details=processed_details
    )

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(Path('/fake/source.png'))
    mock_save_image.assert_not_called()

    assert merge_rule.id.hex in updated_context.merged_maps_details
    details = updated_context.merged_maps_details[merge_rule.id.hex]
    assert details['status'] == 'Failed'
    assert 'error_message' in details
    assert "Failed to load image for merge input" in details['error_message']
    assert str(Path('/fake/source.png')) in details['error_message']

    mock_log_error.assert_called_once()
    assert f"Failed to merge map '{merge_rule.map_type}' for asset '{context.asset_rule.name}'" in mock_log_error.call_args[0][0]
    assert "Failed to load image for merge input" in mock_log_error.call_args[0][0]


@mock.patch('processing.pipeline.stages.map_merging.ipu.save_image')
@mock.patch('processing.pipeline.stages.map_merging.ipu.load_image')
@mock.patch('logging.error')
def test_map_merging_save_image_fails(mock_log_error, mock_load_image, mock_save_image):
    stage = MapMergingStage()

    input_id = uuid.uuid4()
    processed_details = {
        input_id.hex: {'temp_processed_file': '/fake/source.png', 'status': 'Processed', 'map_type': 'SOURCE_MAP'}
    }
    mock_source_data = np.full((10, 10), 128, dtype=np.uint8)
    mock_load_image.return_value = mock_source_data

    # Configure mock_save_image to return False (indicating failure)
    mock_save_image.return_value = False

    merge_inputs = [
        create_mock_merge_input_channel(file_rule_id=input_id, source_channel=0, target_channel=0)
    ]
    merge_settings = create_mock_merge_settings(input_maps=merge_inputs, output_channels=1)
    merge_rule_id = uuid.uuid4()
    merge_rule = create_mock_file_rule_for_merging(id_val=merge_rule_id, map_type="TestSaveFail", merge_settings=merge_settings)

    context = create_map_merging_mock_context(
        initial_file_rules=[merge_rule],
        initial_processed_details=processed_details
    )

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(Path('/fake/source.png'))
    mock_save_image.assert_called_once()  # save_image is called, but returns False

    assert merge_rule.id.hex in updated_context.merged_maps_details
    details = updated_context.merged_maps_details[merge_rule.id.hex]
    assert details['status'] == 'Failed'
    assert 'error_message' in details
    assert "Failed to save merged map" in details['error_message']

    mock_log_error.assert_called_once()
    assert f"Failed to merge map '{merge_rule.map_type}' for asset '{context.asset_rule.name}'" in mock_log_error.call_args[0][0]
    assert "Failed to save merged map" in mock_log_error.call_args[0][0]


@mock.patch('processing.pipeline.stages.map_merging.ipu.save_image')
@mock.patch('processing.pipeline.stages.map_merging.ipu.resize_image')
@mock.patch('processing.pipeline.stages.map_merging.ipu.load_image')
@mock.patch('logging.info')
@mock.patch('logging.error')
def test_map_merging_dimension_mismatch_handling(mock_log_error, mock_log_info, mock_load_image, mock_resize_image, mock_save_image):
    stage = MapMergingStage()

    # Input FileRules
    id1, id2 = uuid.uuid4(), uuid.uuid4()
    processed_details = {
        id1.hex: {'temp_processed_file': '/fake/img1.png', 'status': 'Processed', 'map_type': 'IMG1_SRC'},
        id2.hex: {'temp_processed_file': '/fake/img2.png', 'status': 'Processed', 'map_type': 'IMG2_SRC'}
    }

    # Mock loaded image data with different dimensions
    mock_img1_data = np.full((10, 10), 100, dtype=np.uint8)  # 10x10
    mock_img2_data_original = np.full((5, 5), 200, dtype=np.uint8)  # 5x5, will be resized

    mock_load_image.side_effect = [mock_img1_data, mock_img2_data_original]

    # Mock resize_image to return an image of the target dimensions;
    # for simplicity it just yields a new array of the target size.
    mock_img2_data_resized = np.full((10, 10), 210, dtype=np.uint8)  # Resized to 10x10
    mock_resize_image.return_value = mock_img2_data_resized

    # Merge rule setup. When multiple inputs map to the same target channel, the
    # current stage logic keeps the last one written, so route the two inputs to
    # different channels for a clearer test of the resize behavior.
    merge_inputs = [
        create_mock_merge_input_channel(file_rule_id=id1, source_channel=0, target_channel=0),
        create_mock_merge_input_channel(file_rule_id=id2, source_channel=0, target_channel=1)
    ]
    merge_settings = create_mock_merge_settings(input_maps=merge_inputs, output_channels=2)  # Two output channels
    merge_rule_id = uuid.uuid4()
    merge_rule = create_mock_file_rule_for_merging(id_val=merge_rule_id, map_type="ResizedMerge", merge_settings=merge_settings)

    context = create_map_merging_mock_context(
        initial_file_rules=[merge_rule],
        initial_processed_details=processed_details
    )
    mock_save_image.return_value = True

    updated_context = stage.execute(context)

    assert mock_load_image.call_count == 2
    mock_load_image.assert_any_call(Path('/fake/img1.png'))
    mock_load_image.assert_any_call(Path('/fake/img2.png'))

    # Assert resize_image was called for the second image to match the first's dimensions.
    # The first argument is the image data, the second the target (height, width) tuple;
    # np.array_equal is needed to compare numpy arrays captured by the mock.
    mock_resize_image.assert_called_once()
    assert np.array_equal(mock_resize_image.call_args[0][0], mock_img2_data_original)
    assert mock_resize_image.call_args[0][1] == (10, 10)

    mock_save_image.assert_called_once()

    saved_data = mock_save_image.call_args[0][1]
    assert saved_data.shape == (10, 10, 2)  # Two output channels
    assert np.all(saved_data[:, :, 0] == mock_img1_data)  # First channel from img1
    assert np.all(saved_data[:, :, 1] == mock_img2_data_resized)  # Second channel from resized img2

    assert merge_rule.id.hex in updated_context.merged_maps_details
    details = updated_context.merged_maps_details[merge_rule.id.hex]
    assert details['status'] == 'Processed'
    assert "merged_ResizedMerge" in details['temp_merged_file']
    mock_log_error.assert_not_called()
    mock_log_info.assert_any_call(f"Resized input map from {Path('/fake/img2.png')} from {mock_img2_data_original.shape} to {(10,10)} to match first loaded map.")
    mock_log_info.assert_any_call(f"Successfully merged map '{merge_rule.map_type}' for asset '{context.asset_rule.name}'.")
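

The resize-to-first-input behavior mocked above can be illustrated with a standalone nearest-neighbor sketch. `resize_nearest` is a hypothetical stand-in for `ipu.resize_image`, whose real interpolation algorithm is not shown in these tests:

```python
import numpy as np

def resize_nearest(img, target_shape):
    """Nearest-neighbor resize of a 2D array to (height, width)."""
    th, tw = target_shape
    rows = np.arange(th) * img.shape[0] // th  # map each output row to a source row
    cols = np.arange(tw) * img.shape[1] // tw  # map each output column to a source column
    return img[rows][:, cols]

small = np.full((5, 5), 200, dtype=np.uint8)
resized = resize_nearest(small, (10, 10))
```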


@mock.patch('processing.pipeline.stages.map_merging.ipu.save_image')
@mock.patch('processing.pipeline.stages.map_merging.ipu.resize_image')
@mock.patch('processing.pipeline.stages.map_merging.ipu.load_image')
@mock.patch('logging.info')
@mock.patch('logging.error')
def test_map_merging_to_grayscale_output(mock_log_error, mock_log_info, mock_load_image, mock_resize_image, mock_save_image):
    stage = MapMergingStage()

    # Input FileRule (an RGB image)
    input_id = uuid.uuid4()
    processed_details = {
        input_id.hex: {'temp_processed_file': '/fake/rgb_source.png', 'status': 'Processed', 'map_type': 'RGB_SRC'}
    }
    # Mock loaded image data (3 channels)
    mock_rgb_data = np.full((10, 10, 3), [50, 100, 150], dtype=np.uint8)
    mock_load_image.return_value = mock_rgb_data

    # Merge rule setup: take the green channel (source_channel=1) from the input
    # and map it to the single output channel (target_channel=0).
    merge_inputs = [
        create_mock_merge_input_channel(file_rule_id=input_id, source_channel=1, target_channel=0)  # G to grayscale
    ]
    # output_channels = 1 for grayscale
    merge_settings = create_mock_merge_settings(input_maps=merge_inputs, output_channels=1)
    merge_rule_id = uuid.uuid4()
    merge_rule = create_mock_file_rule_for_merging(id_val=merge_rule_id, map_type="GrayscaleFromGreen", merge_settings=merge_settings)

    context = create_map_merging_mock_context(
        initial_file_rules=[merge_rule],
        initial_processed_details=processed_details
    )
    mock_save_image.return_value = True

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(Path('/fake/rgb_source.png'))
    mock_resize_image.assert_not_called()
    mock_save_image.assert_called_once()

    saved_data = mock_save_image.call_args[0][1]
    assert saved_data.shape == (10, 10)  # Grayscale output (2D)
    assert np.all(saved_data == 100)  # Green channel's value

    assert merge_rule.id.hex in updated_context.merged_maps_details
    details = updated_context.merged_maps_details[merge_rule.id.hex]
    assert details['status'] == 'Processed'
    assert "merged_GrayscaleFromGreen" in details['temp_merged_file']
    mock_log_error.assert_not_called()
    mock_log_info.assert_any_call(f"Successfully merged map '{merge_rule.map_type}' for asset '{context.asset_rule.name}'.")


@mock.patch('processing.pipeline.stages.map_merging.ipu.save_image')
@mock.patch('processing.pipeline.stages.map_merging.ipu.load_image')
@mock.patch('logging.error')
def test_map_merging_default_value_if_missing_channel(mock_log_error, mock_load_image, mock_save_image):
    stage = MapMergingStage()

    input_id = uuid.uuid4()
    processed_details = {
        # Input is a grayscale image (1 channel)
        input_id.hex: {'temp_processed_file': '/fake/gray_source.png', 'status': 'Processed', 'map_type': 'GRAY_SRC'}
    }
    mock_gray_data = np.full((10, 10), 50, dtype=np.uint8)
    mock_load_image.return_value = mock_gray_data

    # Merge rule: read source_channel 1 (which does not exist in a grayscale image)
    # so that default_value_if_missing fills target_channel 0, and read
    # source_channel 0 (which exists) for target_channel 1.
    mic1 = create_mock_merge_input_channel(file_rule_id=input_id, source_channel=1, target_channel=0)
    mic1.default_value_if_missing = 128  # Specific default value for the missing channel
    mic2 = create_mock_merge_input_channel(file_rule_id=input_id, source_channel=0, target_channel=1)

    merge_settings = create_mock_merge_settings(input_maps=[mic1, mic2], output_channels=2)
    merge_rule_id = uuid.uuid4()
    merge_rule = create_mock_file_rule_for_merging(id_val=merge_rule_id, map_type="DefaultValueTest", merge_settings=merge_settings)

    context = create_map_merging_mock_context(
        initial_file_rules=[merge_rule],
        initial_processed_details=processed_details
    )
    mock_save_image.return_value = True

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(Path('/fake/gray_source.png'))
    mock_save_image.assert_called_once()

    saved_data = mock_save_image.call_args[0][1]
    assert saved_data.shape == (10, 10, 2)
    assert np.all(saved_data[:, :, 0] == 128)  # Default value for the missing source channel 1
    assert np.all(saved_data[:, :, 1] == 50)  # Value from the existing source channel 0

    assert merge_rule.id.hex in updated_context.merged_maps_details
    details = updated_context.merged_maps_details[merge_rule.id.hex]
    assert details['status'] == 'Processed'
    mock_log_error.assert_not_called()
@ -0,0 +1,359 @@

import pytest
from unittest import mock
from pathlib import Path
import datetime
import json  # For comparing dumped content
import uuid
from typing import Optional, Dict, Any

from processing.pipeline.stages.metadata_finalization_save import MetadataFinalizationAndSaveStage
from processing.pipeline.asset_context import AssetProcessingContext
from rule_structure import AssetRule, SourceRule
from configuration import Configuration, GeneralSettings


def create_metadata_save_mock_context(
    status_flags: Optional[Dict[str, Any]] = None,
    initial_asset_metadata: Optional[Dict[str, Any]] = None,
    processed_details: Optional[Dict[str, Any]] = None,
    merged_details: Optional[Dict[str, Any]] = None,
    asset_name: str = "MetaSaveAsset",
    output_path_pattern_val: str = "{asset_name}/metadata/(unknown)",
    # ... other common context fields ...
) -> AssetProcessingContext:
    mock_asset_rule = mock.MagicMock(spec=AssetRule)
    mock_asset_rule.name = asset_name
    mock_asset_rule.output_path_pattern = output_path_pattern_val
    mock_asset_rule.id = uuid.uuid4()  # Needed if generate_path_from_pattern uses it

    mock_source_rule = mock.MagicMock(spec=SourceRule)
    mock_source_rule.name = "MetaSaveSource"

    mock_config = mock.MagicMock(spec=Configuration)
    # mock_config.general_settings = mock.MagicMock(spec=GeneralSettings)  # If needed

    context = AssetProcessingContext(
        source_rule=mock_source_rule,
        asset_rule=mock_asset_rule,
        workspace_path=Path("/fake/workspace"),
        engine_temp_dir=Path("/fake/temp_engine_dir"),
        output_base_path=Path("/fake/output_base"),  # For path generation
        effective_supplier="ValidSupplier",
        asset_metadata=initial_asset_metadata if initial_asset_metadata is not None else {},
        processed_maps_details=processed_details if processed_details is not None else {},
        merged_maps_details=merged_details if merged_details is not None else {},
        files_to_process=[],
        loaded_data_cache={},
        config_obj=mock_config,
        status_flags=status_flags if status_flags is not None else {},
        incrementing_value="001",  # Example for path generation
        sha5_value="abc"  # Example for path generation
    )
    return context


@mock.patch('processing.pipeline.stages.metadata_finalization_save.json.dump')
@mock.patch('builtins.open', new_callable=mock.mock_open)
@mock.patch('pathlib.Path.mkdir')
@mock.patch('processing.pipeline.stages.metadata_finalization_save.generate_path_from_pattern')
@mock.patch('datetime.datetime')
def test_asset_skipped_before_metadata_init(
    mock_dt, mock_gen_path, mock_mkdir, mock_file_open, mock_json_dump
):
    """
    Tests that if an asset is marked for skipping and has no initial metadata,
    the stage returns early without attempting to save metadata.
    """
    stage = MetadataFinalizationAndSaveStage()
    context = create_metadata_save_mock_context(
        status_flags={'skip_asset': True},
        initial_asset_metadata={}  # Explicitly empty
    )

    updated_context = stage.execute(context)

    # Assert that no processing or saving attempts were made
    mock_dt.now.assert_not_called()  # Should not even set an end time when there is no metadata
    mock_gen_path.assert_not_called()
    mock_mkdir.assert_not_called()
    mock_file_open.assert_not_called()
    mock_json_dump.assert_not_called()

    assert updated_context.asset_metadata == {}  # Metadata remains empty
    assert 'metadata_file_path' not in updated_context.asset_metadata
    assert updated_context.status_flags.get('metadata_save_error') is None


@mock.patch('processing.pipeline.stages.metadata_finalization_save.json.dump')
@mock.patch('builtins.open', new_callable=mock.mock_open)
@mock.patch('pathlib.Path.mkdir')
@mock.patch('processing.pipeline.stages.metadata_finalization_save.generate_path_from_pattern')
@mock.patch('datetime.datetime')
def test_asset_skipped_after_metadata_init(
    mock_dt, mock_gen_path, mock_mkdir, mock_file_open, mock_json_dump
):
    """
    Tests that if an asset is marked for skipping but already has initial metadata,
    the status is updated to 'Skipped' and the metadata is saved.
    """
    stage = MetadataFinalizationAndSaveStage()

    fixed_now = datetime.datetime(2023, 1, 1, 12, 0, 0)
    mock_dt.now.return_value = fixed_now

    fake_metadata_path_str = "/fake/output_base/SkippedAsset/metadata/SkippedAsset_metadata.json"
    mock_gen_path.return_value = fake_metadata_path_str

    initial_meta = {'asset_name': "SkippedAsset", 'status': "Pending"}

    context = create_metadata_save_mock_context(
        asset_name="SkippedAsset",
        status_flags={'skip_asset': True},
        initial_asset_metadata=initial_meta
    )

    updated_context = stage.execute(context)

    mock_dt.now.assert_called_once()
    mock_gen_path.assert_called_once_with(
        context.asset_rule.output_path_pattern,
        context.asset_rule,
        context.source_rule,
        context.output_base_path,
        context.asset_metadata,  # Original metadata passed for path generation
        context.incrementing_value,
        context.sha5_value,
        filename_override=f"{context.asset_rule.name}_metadata.json"
    )
    mock_mkdir.assert_called_once_with(parents=True, exist_ok=True)
    mock_file_open.assert_called_once_with(Path(fake_metadata_path_str), 'w')
    mock_json_dump.assert_called_once()

    dumped_data = mock_json_dump.call_args[0][0]
    assert dumped_data['status'] == "Skipped"
    assert dumped_data['processing_end_time'] == fixed_now.isoformat()
    assert 'processed_map_details' not in dumped_data  # Not present when skipped early
    assert 'merged_map_details' not in dumped_data  # Not present when skipped early

    assert updated_context.asset_metadata['status'] == "Skipped"
    assert updated_context.asset_metadata['processing_end_time'] == fixed_now.isoformat()
    assert updated_context.asset_metadata['metadata_file_path'] == fake_metadata_path_str
    assert updated_context.status_flags.get('metadata_save_error') is None
|
||||
@mock.patch('processing.pipeline.stages.metadata_finalization_save.json.dump')
@mock.patch('builtins.open', new_callable=mock.mock_open)  # Mocks open()
@mock.patch('pathlib.Path.mkdir')
@mock.patch('processing.pipeline.stages.metadata_finalization_save.generate_path_from_pattern')
@mock.patch('datetime.datetime')
def test_metadata_save_success(mock_dt, mock_gen_path, mock_mkdir, mock_file_open, mock_json_dump):
    """
    Tests successful metadata finalization and saving, including serialization of Path objects.
    """
    stage = MetadataFinalizationAndSaveStage()

    fixed_now = datetime.datetime(2023, 1, 1, 12, 30, 0)
    mock_dt.now.return_value = fixed_now

    fake_metadata_path_str = "/fake/output_base/MetaSaveAsset/metadata/MetaSaveAsset_metadata.json"
    mock_gen_path.return_value = fake_metadata_path_str

    initial_meta = {'asset_name': "MetaSaveAsset", 'status': "Pending", 'processing_start_time': "2023-01-01T12:00:00"}
    # Example of Path objects that need serialization
    proc_details = {'map1': {'temp_processed_file': Path('/fake/temp_engine_dir/map1.png'), 'final_file_path': Path('/fake/output_base/MetaSaveAsset/map1.png')}}
    merged_details = {'merged_map_A': {'output_path': Path('/fake/output_base/MetaSaveAsset/merged_A.png')}}

    context = create_metadata_save_mock_context(
        initial_asset_metadata=initial_meta,
        processed_details=proc_details,
        merged_details=merged_details,
        status_flags={}  # No errors, no skip
    )

    updated_context = stage.execute(context)

    mock_dt.now.assert_called_once()
    mock_gen_path.assert_called_once_with(
        context.asset_rule.output_path_pattern,
        context.asset_rule,
        context.source_rule,
        context.output_base_path,
        context.asset_metadata,  # The metadata *before* adding end_time, status, etc.
        context.incrementing_value,
        context.sha5_value,
        filename_override=f"{context.asset_rule.name}_metadata.json"
    )
    mock_mkdir.assert_called_once_with(parents=True, exist_ok=True)  # Checks parent dir of fake_metadata_path_str
    mock_file_open.assert_called_once_with(Path(fake_metadata_path_str), 'w')
    mock_json_dump.assert_called_once()

    # Check what was passed to json.dump
    dumped_data = mock_json_dump.call_args[0][0]
    assert dumped_data['status'] == "Processed"
    assert dumped_data['processing_end_time'] == fixed_now.isoformat()
    assert 'processing_start_time' in dumped_data  # Ensure existing fields are preserved

    # Verify processed_map_details and Path serialization
    assert 'processed_map_details' in dumped_data
    assert dumped_data['processed_map_details']['map1']['temp_processed_file'] == '/fake/temp_engine_dir/map1.png'
    assert dumped_data['processed_map_details']['map1']['final_file_path'] == '/fake/output_base/MetaSaveAsset/map1.png'

    # Verify merged_map_details and Path serialization
    assert 'merged_map_details' in dumped_data
    assert dumped_data['merged_map_details']['merged_map_A']['output_path'] == '/fake/output_base/MetaSaveAsset/merged_A.png'

    assert updated_context.asset_metadata['metadata_file_path'] == fake_metadata_path_str
    assert updated_context.asset_metadata['status'] == "Processed"
    assert updated_context.status_flags.get('metadata_save_error') is None

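The assertions above expect `json.dump` to receive plain strings wherever the context held `Path` objects. A minimal sketch of one way a stage could perform that conversion before dumping — the helper name `paths_to_strings` is hypothetical, not part of this codebase:

```python
from pathlib import Path


def paths_to_strings(obj):
    """Recursively convert Path values to plain strings for JSON dumping."""
    if isinstance(obj, Path):
        return str(obj)
    if isinstance(obj, dict):
        return {k: paths_to_strings(v) for k, v in obj.items()}
    if isinstance(obj, list):
        return [paths_to_strings(v) for v in obj]
    return obj


details = {'map1': {'temp_processed_file': Path('map1.png')}}
serializable = paths_to_strings(details)  # {'map1': {'temp_processed_file': 'map1.png'}}
```

A common alternative is `json.dump(data, f, default=str)`, which stringifies any non-serializable value at dump time, though it is less explicit about which types are expected.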
@mock.patch('processing.pipeline.stages.metadata_finalization_save.json.dump')
@mock.patch('builtins.open', new_callable=mock.mock_open)
@mock.patch('pathlib.Path.mkdir')
@mock.patch('processing.pipeline.stages.metadata_finalization_save.generate_path_from_pattern')
@mock.patch('datetime.datetime')
def test_processing_failed_due_to_previous_error(
    mock_dt, mock_gen_path, mock_mkdir, mock_file_open, mock_json_dump
):
    """
    Tests that if a previous stage set an error flag, the status is 'Failed'
    and metadata (including any existing details) is saved.
    """
    stage = MetadataFinalizationAndSaveStage()

    fixed_now = datetime.datetime(2023, 1, 1, 12, 45, 0)
    mock_dt.now.return_value = fixed_now

    fake_metadata_path_str = "/fake/output_base/FailedAsset/metadata/FailedAsset_metadata.json"
    mock_gen_path.return_value = fake_metadata_path_str

    initial_meta = {'asset_name': "FailedAsset", 'status': "Processing"}
    # Simulate details that may exist even if a later stage failed
    proc_details = {'map1_partial': {'temp_processed_file': Path('/fake/temp_engine_dir/map1_partial.png')}}

    context = create_metadata_save_mock_context(
        asset_name="FailedAsset",
        initial_asset_metadata=initial_meta,
        processed_details=proc_details,
        merged_details={},  # No merged details if processing failed before that stage
        status_flags={'file_processing_error': True, 'error_message': "Something went wrong"}
    )

    updated_context = stage.execute(context)

    mock_dt.now.assert_called_once()
    mock_gen_path.assert_called_once()  # Path generation should still occur
    mock_mkdir.assert_called_once_with(parents=True, exist_ok=True)
    mock_file_open.assert_called_once_with(Path(fake_metadata_path_str), 'w')
    mock_json_dump.assert_called_once()

    dumped_data = mock_json_dump.call_args[0][0]
    assert dumped_data['status'] == "Failed"
    assert dumped_data['processing_end_time'] == fixed_now.isoformat()
    assert 'error_message' in dumped_data  # Error messages from status_flags are copied over
    assert dumped_data['error_message'] == "Something went wrong"

    # Check that existing details are included
    assert 'processed_map_details' in dumped_data
    assert dumped_data['processed_map_details']['map1_partial']['temp_processed_file'] == '/fake/temp_engine_dir/map1_partial.png'
    assert 'merged_map_details' in dumped_data  # Should be present, even if empty
    assert dumped_data['merged_map_details'] == {}

    assert updated_context.asset_metadata['status'] == "Failed"
    assert updated_context.asset_metadata['metadata_file_path'] == fake_metadata_path_str
    assert updated_context.status_flags.get('metadata_save_error') is None
    # Ensure the original error flag is preserved
    assert updated_context.status_flags['file_processing_error'] is True

@mock.patch('processing.pipeline.stages.metadata_finalization_save.json.dump')
@mock.patch('builtins.open', new_callable=mock.mock_open)
@mock.patch('pathlib.Path.mkdir')
@mock.patch('processing.pipeline.stages.metadata_finalization_save.generate_path_from_pattern')
@mock.patch('datetime.datetime')
@mock.patch('logging.error')  # To check that the error is logged
def test_generate_path_fails(
    mock_log_error, mock_dt, mock_gen_path, mock_mkdir, mock_file_open, mock_json_dump
):
    """
    Tests behavior when generate_path_from_pattern raises an exception.
    Ensures status is updated, the error flag is set, and no save is attempted.
    """
    stage = MetadataFinalizationAndSaveStage()

    fixed_now = datetime.datetime(2023, 1, 1, 12, 50, 0)
    mock_dt.now.return_value = fixed_now

    mock_gen_path.side_effect = Exception("Simulated path generation error")

    initial_meta = {'asset_name': "PathFailAsset", 'status': "Processing"}
    context = create_metadata_save_mock_context(
        asset_name="PathFailAsset",
        initial_asset_metadata=initial_meta,
        status_flags={}
    )

    updated_context = stage.execute(context)

    mock_dt.now.assert_called_once()  # Time is set before path generation
    mock_gen_path.assert_called_once()  # generate_path_from_pattern is called

    # File operations should NOT be called if path generation fails
    mock_mkdir.assert_not_called()
    mock_file_open.assert_not_called()
    mock_json_dump.assert_not_called()

    mock_log_error.assert_called_once()  # Check that an error was logged
    # Example: check that the log message contains relevant info, if needed
    # assert "Failed to generate metadata path" in mock_log_error.call_args[0][0]

    assert updated_context.asset_metadata['status'] == "Failed"  # Or a more specific error status
    assert 'processing_end_time' in updated_context.asset_metadata  # End time should still be set
    assert updated_context.asset_metadata['processing_end_time'] == fixed_now.isoformat()
    assert 'metadata_file_path' not in updated_context.asset_metadata  # Path should not be set

    assert updated_context.status_flags.get('metadata_save_error') is True
    assert 'error_message' in updated_context.asset_metadata  # Check that the error message is populated
    assert "Simulated path generation error" in updated_context.asset_metadata['error_message']

@mock.patch('processing.pipeline.stages.metadata_finalization_save.json.dump')
@mock.patch('builtins.open', new_callable=mock.mock_open)
@mock.patch('pathlib.Path.mkdir')
@mock.patch('processing.pipeline.stages.metadata_finalization_save.generate_path_from_pattern')
@mock.patch('datetime.datetime')
@mock.patch('logging.error')  # To check that the error is logged
def test_json_dump_fails(
    mock_log_error, mock_dt, mock_gen_path, mock_mkdir, mock_file_open, mock_json_dump
):
    """
    Tests behavior when json.dump raises an exception during saving.
    Ensures status is updated, the error flag is set, and the error is logged.
    """
    stage = MetadataFinalizationAndSaveStage()

    fixed_now = datetime.datetime(2023, 1, 1, 12, 55, 0)
    mock_dt.now.return_value = fixed_now

    fake_metadata_path_str = "/fake/output_base/JsonDumpFailAsset/metadata/JsonDumpFailAsset_metadata.json"
    mock_gen_path.return_value = fake_metadata_path_str

    mock_json_dump.side_effect = IOError("Simulated JSON dump error")  # Or TypeError for non-serializable data

    initial_meta = {'asset_name': "JsonDumpFailAsset", 'status': "Processing"}
    context = create_metadata_save_mock_context(
        asset_name="JsonDumpFailAsset",
        initial_asset_metadata=initial_meta,
        status_flags={}
    )

    updated_context = stage.execute(context)

    mock_dt.now.assert_called_once()
    mock_gen_path.assert_called_once()
    mock_mkdir.assert_called_once_with(parents=True, exist_ok=True)
    mock_file_open.assert_called_once_with(Path(fake_metadata_path_str), 'w')
    mock_json_dump.assert_called_once()  # json.dump was attempted

    mock_log_error.assert_called_once()
    # assert "Failed to save metadata JSON" in mock_log_error.call_args[0][0]

    assert updated_context.asset_metadata['status'] == "Failed"  # Or a specific "Metadata Save Failed" status
    assert 'processing_end_time' in updated_context.asset_metadata
    assert updated_context.asset_metadata['processing_end_time'] == fixed_now.isoformat()
    # metadata_file_path might be set if path generation succeeded, even if dump failed.
    # Depending on desired behavior, this could be asserted or not.
    # For now, assume it's set if path generation was successful.
    assert updated_context.asset_metadata['metadata_file_path'] == fake_metadata_path_str

    assert updated_context.status_flags.get('metadata_save_error') is True
    assert 'error_message' in updated_context.asset_metadata
    assert "Simulated JSON dump error" in updated_context.asset_metadata['error_message']

169
tests/processing/pipeline/stages/test_metadata_initialization.py
Normal file
@ -0,0 +1,169 @@
import pytest
from unittest import mock
from pathlib import Path
import datetime
import uuid
from typing import Optional

from processing.pipeline.stages.metadata_initialization import MetadataInitializationStage
from processing.pipeline.asset_context import AssetProcessingContext
from rule_structure import AssetRule, SourceRule
from configuration import Configuration, GeneralSettings


# Helper function to create a mock AssetProcessingContext
def create_metadata_init_mock_context(
    skip_asset_flag: bool = False,
    asset_name: str = "MetaAsset",
    asset_id: Optional[uuid.UUID] = None,  # None defaults to a fresh uuid.uuid4()
    source_path_str: str = "source/meta_asset",
    output_pattern: str = "{asset_name}/{map_type}",
    tags: Optional[list] = None,
    custom_fields: Optional[dict] = None,
    source_rule_name: str = "MetaSource",
    source_rule_id: Optional[uuid.UUID] = None,  # None defaults to a fresh uuid.uuid4()
    eff_supplier: Optional[str] = "SupplierMeta",
    app_version_str: str = "1.0.0-test",
    inc_val: Optional[str] = None,
    sha_val: Optional[str] = None
) -> AssetProcessingContext:
    mock_asset_rule = mock.MagicMock(spec=AssetRule)
    mock_asset_rule.name = asset_name
    mock_asset_rule.id = asset_id if asset_id is not None else uuid.uuid4()
    mock_asset_rule.source_path = Path(source_path_str)
    mock_asset_rule.output_path_pattern = output_pattern
    mock_asset_rule.tags = tags if tags is not None else ["tag1", "test_tag"]
    mock_asset_rule.custom_fields = custom_fields if custom_fields is not None else {"custom_key": "custom_value"}

    mock_source_rule = mock.MagicMock(spec=SourceRule)
    mock_source_rule.name = source_rule_name
    mock_source_rule.id = source_rule_id if source_rule_id is not None else uuid.uuid4()

    mock_general_settings = mock.MagicMock(spec=GeneralSettings)
    mock_general_settings.app_version = app_version_str

    mock_config = mock.MagicMock(spec=Configuration)
    mock_config.general_settings = mock_general_settings

    context = AssetProcessingContext(
        source_rule=mock_source_rule,
        asset_rule=mock_asset_rule,
        workspace_path=Path("/fake/workspace"),
        engine_temp_dir=Path("/fake/temp"),
        output_base_path=Path("/fake/output"),
        effective_supplier=eff_supplier,
        asset_metadata={},
        processed_maps_details={},
        merged_maps_details={},
        files_to_process=[],
        loaded_data_cache={},
        config_obj=mock_config,
        status_flags={'skip_asset': skip_asset_flag},
        incrementing_value=inc_val,
        sha5_value=sha_val
    )
    return context

@mock.patch('processing.pipeline.stages.metadata_initialization.datetime')
def test_metadata_initialization_not_skipped(mock_datetime_module):
    stage = MetadataInitializationStage()

    fixed_now = datetime.datetime(2023, 10, 26, 12, 0, 0, tzinfo=datetime.timezone.utc)
    mock_datetime_module.datetime.now.return_value = fixed_now

    asset_id_val = uuid.uuid4()
    source_id_val = uuid.uuid4()

    context = create_metadata_init_mock_context(
        skip_asset_flag=False,
        asset_id=asset_id_val,
        source_rule_id=source_id_val,
        inc_val="001",
        sha_val="abcde"
    )

    updated_context = stage.execute(context)

    assert isinstance(updated_context.asset_metadata, dict)
    assert isinstance(updated_context.processed_maps_details, dict)
    assert isinstance(updated_context.merged_maps_details, dict)

    md = updated_context.asset_metadata
    assert md['asset_name'] == "MetaAsset"
    assert md['asset_id'] == str(asset_id_val)
    assert md['source_rule_name'] == "MetaSource"
    assert md['source_rule_id'] == str(source_id_val)
    assert md['source_path'] == "source/meta_asset"
    assert md['effective_supplier'] == "SupplierMeta"
    assert md['output_path_pattern'] == "{asset_name}/{map_type}"
    assert md['processing_start_time'] == fixed_now.isoformat()
    assert md['status'] == "Pending"
    assert md['version'] == "1.0.0-test"
    assert md['tags'] == ["tag1", "test_tag"]
    assert md['custom_fields'] == {"custom_key": "custom_value"}
    assert md['incrementing_value'] == "001"
    assert md['sha5_value'] == "abcde"

@mock.patch('processing.pipeline.stages.metadata_initialization.datetime')
def test_metadata_initialization_not_skipped_none_inc_sha(mock_datetime_module):
    stage = MetadataInitializationStage()

    fixed_now = datetime.datetime(2023, 10, 26, 12, 0, 0, tzinfo=datetime.timezone.utc)
    mock_datetime_module.datetime.now.return_value = fixed_now

    context = create_metadata_init_mock_context(
        skip_asset_flag=False,
        inc_val=None,
        sha_val=None
    )

    updated_context = stage.execute(context)

    md = updated_context.asset_metadata
    assert 'incrementing_value' not in md  # Or assert md['incrementing_value'] is None, depending on desired behavior
    assert 'sha5_value' not in md  # Or assert md['sha5_value'] is None

def test_metadata_initialization_skipped():
    stage = MetadataInitializationStage()
    context = create_metadata_init_mock_context(skip_asset_flag=True)

    # Make copies of the initial state to ensure it is not modified
    initial_asset_metadata = dict(context.asset_metadata)
    initial_processed_maps = dict(context.processed_maps_details)
    initial_merged_maps = dict(context.merged_maps_details)

    updated_context = stage.execute(context)

    assert updated_context.asset_metadata == initial_asset_metadata
    assert updated_context.processed_maps_details == initial_processed_maps
    assert updated_context.merged_maps_details == initial_merged_maps
    assert not updated_context.asset_metadata  # Explicitly check it's empty, as per the initial setup
    assert not updated_context.processed_maps_details
    assert not updated_context.merged_maps_details

@mock.patch('processing.pipeline.stages.metadata_initialization.datetime')
def test_tags_and_custom_fields_are_copies(mock_datetime_module):
    stage = MetadataInitializationStage()
    fixed_now = datetime.datetime(2023, 10, 26, 12, 0, 0, tzinfo=datetime.timezone.utc)
    mock_datetime_module.datetime.now.return_value = fixed_now

    original_tags = ["original_tag"]
    original_custom_fields = {"original_key": "original_value"}

    context = create_metadata_init_mock_context(
        skip_asset_flag=False,
        tags=original_tags,
        custom_fields=original_custom_fields
    )

    # Modify the originals after context creation but before stage execution
    original_tags.append("modified_after_creation")
    original_custom_fields["new_key_after_creation"] = "new_value"

    updated_context = stage.execute(context)

    md = updated_context.asset_metadata
    assert md['tags'] == ["original_tag"]  # Should not contain "modified_after_creation"
    assert md['tags'] is not original_tags  # Ensure it's a different object

    assert md['custom_fields'] == {"original_key": "original_value"}  # Should not contain "new_key_after_creation"
    assert md['custom_fields'] is not original_custom_fields  # Ensure it's a different object

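The test above hinges on the stage storing shallow copies of the rule's mutable fields rather than references to them. The guarantee it checks can be sketched in isolation, assuming plain `list()` / `dict()` copies:

```python
original_tags = ["original_tag"]
original_fields = {"original_key": "original_value"}

# Shallow copies decouple the metadata from later mutations of the rule objects
metadata = {
    'tags': list(original_tags),
    'custom_fields': dict(original_fields),
}

# Mutating the originals must not leak into the metadata
original_tags.append("modified_after_creation")
original_fields["new_key"] = "new_value"

assert metadata['tags'] == ["original_tag"]
assert metadata['custom_fields'] == {"original_key": "original_value"}
```

Note that a shallow copy only protects the top level; if tag or field values were themselves mutable, `copy.deepcopy` would be needed for the same guarantee.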
@ -0,0 +1,323 @@
|
||||
import pytest
from unittest import mock
from pathlib import Path
import uuid
import numpy as np
import logging  # Needed for mocking the logger
from typing import Optional

from processing.pipeline.stages.normal_map_green_channel import NormalMapGreenChannelStage
from processing.pipeline.asset_context import AssetProcessingContext
from rule_structure import AssetRule, SourceRule, FileRule
from configuration import Configuration, GeneralSettings


# Helper functions
def create_mock_file_rule_for_normal_test(
    id_val: Optional[uuid.UUID] = None,  # None defaults to a fresh uuid.uuid4()
    map_type: str = "NORMAL",
    filename_pattern: str = "normal.png"
) -> mock.MagicMock:
    mock_fr = mock.MagicMock(spec=FileRule)
    mock_fr.id = id_val if id_val else uuid.uuid4()
    mock_fr.map_type = map_type
    mock_fr.filename_pattern = filename_pattern
    mock_fr.item_type = "MAP_COL"  # Not directly used by the stage
    mock_fr.active = True
    return mock_fr


def create_normal_map_mock_context(
    initial_file_rules: Optional[list] = None,
    initial_processed_details: Optional[dict] = None,
    invert_green_globally: bool = False,
    skip_asset_flag: bool = False,
    asset_name: str = "NormalMapAsset"
) -> AssetProcessingContext:
    mock_asset_rule = mock.MagicMock(spec=AssetRule)
    mock_asset_rule.name = asset_name

    mock_source_rule = mock.MagicMock(spec=SourceRule)

    mock_gs = mock.MagicMock(spec=GeneralSettings)
    mock_gs.invert_normal_map_green_channel_globally = invert_green_globally

    mock_config = mock.MagicMock(spec=Configuration)
    mock_config.general_settings = mock_gs

    context = AssetProcessingContext(
        source_rule=mock_source_rule,
        asset_rule=mock_asset_rule,
        workspace_path=Path("/fake/workspace"),
        engine_temp_dir=Path("/fake/temp_engine_dir"),
        output_base_path=Path("/fake/output"),
        effective_supplier="ValidSupplier",
        asset_metadata={'asset_name': asset_name},
        processed_maps_details=initial_processed_details if initial_processed_details is not None else {},
        merged_maps_details={},
        files_to_process=list(initial_file_rules) if initial_file_rules else [],
        loaded_data_cache={},
        config_obj=mock_config,
        status_flags={'skip_asset': skip_asset_flag},
        incrementing_value=None,  # Required by the AssetProcessingContext constructor
        sha5_value=None  # Required by the AssetProcessingContext constructor
    )
    return context


# Unit tests
@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.save_image')
@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.load_image')
def test_asset_skipped(mock_load_image, mock_save_image):
    stage = NormalMapGreenChannelStage()
    normal_fr = create_mock_file_rule_for_normal_test(map_type="NORMAL")
    initial_details = {
        normal_fr.id.hex: {'temp_processed_file': '/fake/temp_engine_dir/processed_normal.png', 'status': 'Processed', 'map_type': 'NORMAL', 'notes': ''}
    }
    context = create_normal_map_mock_context(
        initial_file_rules=[normal_fr],
        initial_processed_details=initial_details,
        invert_green_globally=True,
        skip_asset_flag=True  # Asset is skipped
    )
    original_details = context.processed_maps_details.copy()

    updated_context = stage.execute(context)

    mock_load_image.assert_not_called()
    mock_save_image.assert_not_called()
    assert updated_context.processed_maps_details == original_details
    assert normal_fr in updated_context.files_to_process  # Ensure the rule is still there

@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.save_image')
@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.load_image')
def test_no_normal_map_present(mock_load_image, mock_save_image):
    stage = NormalMapGreenChannelStage()
    # Create a non-normal map rule
    diffuse_fr = create_mock_file_rule_for_normal_test(map_type="DIFFUSE", filename_pattern="diffuse.png")
    initial_details = {
        diffuse_fr.id.hex: {'temp_processed_file': '/fake/temp_engine_dir/processed_diffuse.png', 'status': 'Processed', 'map_type': 'DIFFUSE', 'notes': ''}
    }
    context = create_normal_map_mock_context(
        initial_file_rules=[diffuse_fr],
        initial_processed_details=initial_details,
        invert_green_globally=True  # Inversion enabled, but no normal map
    )
    original_details = context.processed_maps_details.copy()

    updated_context = stage.execute(context)

    mock_load_image.assert_not_called()
    mock_save_image.assert_not_called()
    assert updated_context.processed_maps_details == original_details
    assert diffuse_fr in updated_context.files_to_process

@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.save_image')
@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.load_image')
def test_normal_map_present_inversion_disabled(mock_load_image, mock_save_image):
    stage = NormalMapGreenChannelStage()
    normal_rule_id = uuid.uuid4()
    normal_fr = create_mock_file_rule_for_normal_test(id_val=normal_rule_id, map_type="NORMAL")
    initial_details = {
        normal_fr.id.hex: {'temp_processed_file': '/fake/temp_engine_dir/processed_normal.png', 'status': 'Processed', 'map_type': 'NORMAL', 'notes': 'Initial note'}
    }
    context = create_normal_map_mock_context(
        initial_file_rules=[normal_fr],
        initial_processed_details=initial_details,
        invert_green_globally=False  # Inversion disabled
    )
    original_details_entry = context.processed_maps_details[normal_fr.id.hex].copy()

    updated_context = stage.execute(context)

    mock_load_image.assert_not_called()
    mock_save_image.assert_not_called()
    assert updated_context.processed_maps_details[normal_fr.id.hex] == original_details_entry
    assert normal_fr in updated_context.files_to_process

@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.save_image')
@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.load_image')
@mock.patch('logging.info')
@mock.patch('logging.debug')
def test_normal_map_inversion_uint8_success(mock_log_debug, mock_log_info, mock_load_image, mock_save_image):
    stage = NormalMapGreenChannelStage()

    normal_rule_id = uuid.uuid4()
    normal_fr = create_mock_file_rule_for_normal_test(id_val=normal_rule_id, map_type="NORMAL")

    initial_temp_path = Path('/fake/temp_engine_dir/processed_normal.png')
    initial_details = {
        normal_fr.id.hex: {'temp_processed_file': str(initial_temp_path), 'status': 'Processed', 'map_type': 'NORMAL', 'notes': 'Initial note'}
    }
    context = create_normal_map_mock_context(
        initial_file_rules=[normal_fr],
        initial_processed_details=initial_details,
        invert_green_globally=True  # Enable inversion
    )

    # R=10, G=50, B=100
    mock_loaded_normal_data = np.array([[[10, 50, 100]]], dtype=np.uint8)
    mock_load_image.return_value = mock_loaded_normal_data
    mock_save_image.return_value = True  # Simulate a successful save

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(initial_temp_path)

    # Check that save_image was called with the green channel inverted
    assert mock_save_image.call_count == 1
    saved_path_arg, saved_data_arg = mock_save_image.call_args[0]

    assert saved_data_arg[0, 0, 0] == 10  # R unchanged
    assert saved_data_arg[0, 0, 1] == 255 - 50  # G inverted
    assert saved_data_arg[0, 0, 2] == 100  # B unchanged

    assert isinstance(saved_path_arg, Path)
    assert "normal_g_inv_" in saved_path_arg.name
    assert saved_path_arg.parent == initial_temp_path.parent  # Should be in the same temp dir

    normal_detail = updated_context.processed_maps_details[normal_fr.id.hex]
    assert "normal_g_inv_" in normal_detail['temp_processed_file']
    assert Path(normal_detail['temp_processed_file']).name == saved_path_arg.name
    assert "Green channel inverted" in normal_detail['notes']
    assert "Initial note" in normal_detail['notes']  # Check that existing notes are preserved

    assert normal_fr in updated_context.files_to_process

@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.save_image')
@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.load_image')
@mock.patch('logging.info')
@mock.patch('logging.debug')
def test_normal_map_inversion_float_success(mock_log_debug, mock_log_info, mock_load_image, mock_save_image):
    stage = NormalMapGreenChannelStage()
    normal_rule_id = uuid.uuid4()
    normal_fr = create_mock_file_rule_for_normal_test(id_val=normal_rule_id, map_type="NORMAL")
    initial_temp_path = Path('/fake/temp_engine_dir/processed_normal_float.png')
    initial_details = {
        normal_fr.id.hex: {'temp_processed_file': str(initial_temp_path), 'status': 'Processed', 'map_type': 'NORMAL', 'notes': 'Float image'}
    }
    context = create_normal_map_mock_context(
        initial_file_rules=[normal_fr],
        initial_processed_details=initial_details,
        invert_green_globally=True
    )

    # R=0.1, G=0.25, B=0.75
    mock_loaded_normal_data = np.array([[[0.1, 0.25, 0.75]]], dtype=np.float32)
    mock_load_image.return_value = mock_loaded_normal_data
    mock_save_image.return_value = True

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(initial_temp_path)

    assert mock_save_image.call_count == 1
    saved_path_arg, saved_data_arg = mock_save_image.call_args[0]

    assert np.isclose(saved_data_arg[0, 0, 0], 0.1)  # R unchanged
    assert np.isclose(saved_data_arg[0, 0, 1], 1.0 - 0.25)  # G inverted
    assert np.isclose(saved_data_arg[0, 0, 2], 0.75)  # B unchanged

    assert "normal_g_inv_" in saved_path_arg.name
    normal_detail = updated_context.processed_maps_details[normal_fr.id.hex]
    assert "normal_g_inv_" in normal_detail['temp_processed_file']
    assert "Green channel inverted" in normal_detail['notes']
    assert "Float image" in normal_detail['notes']
    assert normal_fr in updated_context.files_to_process

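The uint8 and float tests above pin down the same inversion rule at two dtypes: `255 - G` for 8-bit data and `1.0 - G` for normalized floats. A minimal dtype-aware sketch of that rule — the helper `invert_green` is hypothetical, not the stage's actual implementation:

```python
import numpy as np


def invert_green(img: np.ndarray) -> np.ndarray:
    """Return a copy of an RGB array with the green channel inverted, dtype-aware."""
    out = img.copy()
    if np.issubdtype(img.dtype, np.integer):
        max_val = np.iinfo(img.dtype).max  # 255 for uint8, 65535 for uint16
        out[..., 1] = max_val - out[..., 1]
    else:
        out[..., 1] = 1.0 - out[..., 1]  # Floats assumed normalized to [0, 1]
    return out


pixel_u8 = np.array([[[10, 50, 100]]], dtype=np.uint8)
pixel_f32 = np.array([[[0.1, 0.25, 0.75]]], dtype=np.float32)
inverted_u8 = invert_green(pixel_u8)    # green becomes 255 - 50 = 205
inverted_f32 = invert_green(pixel_f32)  # green becomes 1.0 - 0.25 = 0.75
```

Branching on `np.iinfo(img.dtype).max` rather than hardcoding 255 keeps the rule correct for 16-bit normal maps as well, which matters if suppliers deliver higher bit depths.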
@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.save_image')
@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.load_image')
@mock.patch('logging.error')
def test_load_image_fails(mock_log_error, mock_load_image, mock_save_image):
    stage = NormalMapGreenChannelStage()
    normal_rule_id = uuid.uuid4()
    normal_fr = create_mock_file_rule_for_normal_test(id_val=normal_rule_id, map_type="NORMAL")
    initial_temp_path_str = '/fake/temp_engine_dir/processed_normal_load_fail.png'
    initial_details = {
        normal_fr.id.hex: {'temp_processed_file': initial_temp_path_str, 'status': 'Processed', 'map_type': 'NORMAL', 'notes': 'Load fail test'}
    }
    context = create_normal_map_mock_context(
        initial_file_rules=[normal_fr],
        initial_processed_details=initial_details,
        invert_green_globally=True
    )
    original_details_entry = context.processed_maps_details[normal_fr.id.hex].copy()

    mock_load_image.return_value = None  # Simulate a load failure

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(Path(initial_temp_path_str))
    mock_save_image.assert_not_called()
    mock_log_error.assert_called_once()
    assert f"Failed to load image {Path(initial_temp_path_str)} for green channel inversion." in mock_log_error.call_args[0][0]

    # Details should be unchanged
    assert updated_context.processed_maps_details[normal_fr.id.hex] == original_details_entry
    assert normal_fr in updated_context.files_to_process

@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.save_image')
@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.load_image')
@mock.patch('logging.error')
def test_save_image_fails(mock_log_error, mock_load_image, mock_save_image):
    stage = NormalMapGreenChannelStage()
    normal_rule_id = uuid.uuid4()
    normal_fr = create_mock_file_rule_for_normal_test(id_val=normal_rule_id, map_type="NORMAL")
    initial_temp_path = Path('/fake/temp_engine_dir/processed_normal_save_fail.png')
    initial_details = {
        normal_fr.id.hex: {'temp_processed_file': str(initial_temp_path), 'status': 'Processed', 'map_type': 'NORMAL', 'notes': 'Save fail test'}
    }
    context = create_normal_map_mock_context(
        initial_file_rules=[normal_fr],
        initial_processed_details=initial_details,
        invert_green_globally=True
    )
    original_details_entry = context.processed_maps_details[normal_fr.id.hex].copy()

    mock_loaded_normal_data = np.array([[[10, 50, 100]]], dtype=np.uint8)
    mock_load_image.return_value = mock_loaded_normal_data
    mock_save_image.return_value = False  # Simulate save failure

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(initial_temp_path)
    mock_save_image.assert_called_once()  # Save is attempted

    saved_path_arg = mock_save_image.call_args[0][0]  # The path it tried to save to
    mock_log_error.assert_called_once()
    assert f"Failed to save green channel inverted image to {saved_path_arg}." in mock_log_error.call_args[0][0]

    # Details should be unchanged
    assert updated_context.processed_maps_details[normal_fr.id.hex] == original_details_entry
    assert normal_fr in updated_context.files_to_process

@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.save_image')
@mock.patch('processing.pipeline.stages.normal_map_green_channel.ipu.load_image')
@mock.patch('logging.error')
@pytest.mark.parametrize("unsuitable_data, description", [
    (np.array([[1, 2], [3, 4]], dtype=np.uint8), "2D array"),
    (np.array([[[1, 2]]], dtype=np.uint8), "2-channel image"),  # fewer than 3 channels
])
def test_image_not_suitable_for_inversion(mock_log_error, mock_load_image, mock_save_image, unsuitable_data, description):
    stage = NormalMapGreenChannelStage()
    normal_rule_id = uuid.uuid4()
    normal_fr = create_mock_file_rule_for_normal_test(id_val=normal_rule_id, map_type="NORMAL")
    initial_temp_path_str = f'/fake/temp_engine_dir/unsuitable_{description.replace(" ", "_")}.png'
    initial_details = {
        normal_fr.id.hex: {'temp_processed_file': initial_temp_path_str, 'status': 'Processed', 'map_type': 'NORMAL', 'notes': f'Unsuitable: {description}'}
    }
    context = create_normal_map_mock_context(
        initial_file_rules=[normal_fr],
        initial_processed_details=initial_details,
        invert_green_globally=True
    )
    original_details_entry = context.processed_maps_details[normal_fr.id.hex].copy()

    mock_load_image.return_value = unsuitable_data

    updated_context = stage.execute(context)

    mock_load_image.assert_called_once_with(Path(initial_temp_path_str))
    mock_save_image.assert_not_called()  # Save should not be attempted
    mock_log_error.assert_called_once()
    assert f"Image at {Path(initial_temp_path_str)} is not suitable for green channel inversion (e.g., not RGB/RGBA)." in mock_log_error.call_args[0][0]

    # Details should be unchanged
    assert updated_context.processed_maps_details[normal_fr.id.hex] == original_details_entry
    assert normal_fr in updated_context.files_to_process
417
tests/processing/pipeline/stages/test_output_organization.py
Normal file
@@ -0,0 +1,417 @@

import pytest
from unittest import mock
from pathlib import Path
import shutil  # To check if shutil.copy2 is called
import uuid
from typing import Optional  # For type hinting in the helper

from processing.pipeline.stages.output_organization import OutputOrganizationStage
from processing.pipeline.asset_context import AssetProcessingContext
from rule_structure import AssetRule, SourceRule, FileRule  # For context setup
from configuration import Configuration, GeneralSettings

def create_output_org_mock_context(
    status_flags: Optional[dict] = None,
    asset_metadata_status: str = "Processed",  # Default to processed for testing copy
    processed_map_details: Optional[dict] = None,
    merged_map_details: Optional[dict] = None,
    overwrite_setting: bool = False,
    asset_name: str = "OutputOrgAsset",
    output_path_pattern_val: str = "{asset_name}/{map_type}/(unknown)"
) -> AssetProcessingContext:
    mock_asset_rule = mock.MagicMock(spec=AssetRule)
    mock_asset_rule.name = asset_name
    mock_asset_rule.output_path_pattern = output_path_pattern_val
    # FileRules would be needed on the AssetRule if the stage looks up
    # output_filename_pattern from them; for simplicity, assume the stage
    # constructs the output filename itself when none is found on a FileRule.
    mock_asset_rule.file_rules = []

    mock_source_rule = mock.MagicMock(spec=SourceRule)
    mock_source_rule.name = "OutputOrgSource"

    mock_gs = mock.MagicMock(spec=GeneralSettings)
    mock_gs.overwrite_existing = overwrite_setting

    mock_config = mock.MagicMock(spec=Configuration)
    mock_config.general_settings = mock_gs

    # Ensure asset_metadata has a status
    initial_asset_metadata = {'asset_name': asset_name, 'status': asset_metadata_status}

    context = AssetProcessingContext(
        source_rule=mock_source_rule,
        asset_rule=mock_asset_rule,
        workspace_path=Path("/fake/workspace"),
        engine_temp_dir=Path("/fake/temp_engine_dir"),
        output_base_path=Path("/fake/output_final"),
        effective_supplier="ValidSupplier",
        asset_metadata=initial_asset_metadata,
        processed_maps_details=processed_map_details if processed_map_details is not None else {},
        merged_maps_details=merged_map_details if merged_map_details is not None else {},
        files_to_process=[],  # Not used directly by this stage
        loaded_data_cache={},
        config_obj=mock_config,
        status_flags=status_flags if status_flags is not None else {},
        incrementing_value="001",
        sha5_value="xyz"  # TODO: confirm field name (sha5_value vs. sha256_value)
    )
    return context

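The `output_path_pattern` wired into the context above is ultimately resolved by `generate_path_from_pattern`. As a rough illustration only (the real helper takes the full context and map detail, and its complete token set is not shown here), a format-string stand-in with a hypothetical name:

```python
from pathlib import Path

def resolve_output_path(base: Path, dir_pattern: str, output_filename: str, **tokens) -> Path:
    """Hypothetical stand-in: fill {asset_name}/{map_type}-style tokens in the
    directory pattern, then append the per-map output filename."""
    return base / dir_pattern.format(**tokens) / output_filename
```

For example, resolving `"{asset_name}/{map_type}"` under `/fake/output_final` with `asset_name="OutputOrgAsset"` and `map_type="Diffuse"` produces the kind of final path the tests in this file assert against.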
@mock.patch('shutil.copy2')
@mock.patch('logging.info')  # To check for log messages
def test_output_organization_asset_skipped_by_status_flag(mock_log_info, mock_shutil_copy):
    stage = OutputOrganizationStage()
    context = create_output_org_mock_context(status_flags={'skip_asset': True})

    updated_context = stage.execute(context)

    # No copy is the primary check; if the stage logs the skip, assert on it too, e.g.:
    # mock_log_info.assert_any_call("Skipping output organization for asset OutputOrgAsset due to skip_asset flag.")
    mock_shutil_copy.assert_not_called()
    assert 'final_output_files' not in updated_context.asset_metadata
    # Status is untouched when the asset is skipped by flag before the stage logic runs
    assert updated_context.asset_metadata['status'] == "Processed"

@mock.patch('shutil.copy2')
@mock.patch('logging.warning')  # Or logging.info, depending on how the failure is logged
def test_output_organization_asset_failed_by_metadata_status(mock_log_warning, mock_shutil_copy):
    stage = OutputOrganizationStage()
    context = create_output_org_mock_context(asset_metadata_status="Failed")

    updated_context = stage.execute(context)

    mock_shutil_copy.assert_not_called()
    # If the stage logs the skip, assert on it, e.g.:
    # mock_log_warning.assert_any_call("Skipping output organization for asset OutputOrgAsset as its status is Failed.")
    assert 'final_output_files' not in updated_context.asset_metadata
    assert updated_context.asset_metadata['status'] == "Failed"  # Status remains Failed

@mock.patch('shutil.copy2')
@mock.patch('pathlib.Path.mkdir')
@mock.patch('pathlib.Path.exists')
@mock.patch('processing.pipeline.stages.output_organization.generate_path_from_pattern')
@mock.patch('logging.info')
@mock.patch('logging.error')
def test_output_organization_success_no_overwrite(
    mock_log_error, mock_log_info, mock_gen_path, mock_path_exists, mock_mkdir, mock_shutil_copy
):
    stage = OutputOrganizationStage()

    proc_id_1 = uuid.uuid4().hex
    merged_id_1 = uuid.uuid4().hex

    processed_details = {
        proc_id_1: {'status': 'Processed', 'temp_processed_file': '/fake/temp_engine_dir/proc1.png', 'map_type': 'Diffuse', 'output_filename': 'OutputOrgAsset_Diffuse.png'}
    }
    merged_details = {
        merged_id_1: {'status': 'Processed', 'temp_merged_file': '/fake/temp_engine_dir/merged1.png', 'map_type': 'ORM', 'output_filename': 'OutputOrgAsset_ORM.png'}
    }

    context = create_output_org_mock_context(
        processed_map_details=processed_details,
        merged_map_details=merged_details,
        overwrite_setting=False
    )

    # Mock generate_path_from_pattern to return a different path per call
    final_path_proc1 = Path("/fake/output_final/OutputOrgAsset/Diffuse/OutputOrgAsset_Diffuse.png")
    final_path_merged1 = Path("/fake/output_final/OutputOrgAsset/ORM/OutputOrgAsset_ORM.png")

    # The stage calls generate_path_from_pattern(context, map_detail,
    # map_type_key, temp_file_key, output_filename_key); the side_effect must
    # match that signature. Use .get() so merged details (which lack
    # 'temp_processed_file') do not raise KeyError.
    def gen_path_side_effect(ctx, detail, map_type_key, temp_file_key, output_filename_key):
        if detail.get('temp_processed_file') == '/fake/temp_engine_dir/proc1.png':
            return final_path_proc1
        elif detail.get('temp_merged_file') == '/fake/temp_engine_dir/merged1.png':
            return final_path_merged1
        raise ValueError("Unexpected call to generate_path_from_pattern")

    mock_gen_path.side_effect = gen_path_side_effect

    mock_path_exists.return_value = False  # Files do not exist at destination

    updated_context = stage.execute(context)

    assert mock_shutil_copy.call_count == 2
    mock_shutil_copy.assert_any_call(Path(processed_details[proc_id_1]['temp_processed_file']), final_path_proc1)
    mock_shutil_copy.assert_any_call(Path(merged_details[merged_id_1]['temp_merged_file']), final_path_merged1)

    # mkdir should be called for each unique parent directory
    expected_mkdir_calls = [
        mock.call(Path("/fake/output_final/OutputOrgAsset/Diffuse"), parents=True, exist_ok=True),
        mock.call(Path("/fake/output_final/OutputOrgAsset/ORM"), parents=True, exist_ok=True)
    ]
    mock_mkdir.assert_has_calls(expected_mkdir_calls, any_order=True)
    assert mock_mkdir.call_count >= 1  # 1 or 2, depending on whether the paths share a base created once

    assert len(updated_context.asset_metadata['final_output_files']) == 2
    assert str(final_path_proc1) in updated_context.asset_metadata['final_output_files']
    assert str(final_path_merged1) in updated_context.asset_metadata['final_output_files']

    assert updated_context.processed_maps_details[proc_id_1]['final_output_path'] == str(final_path_proc1)
    assert updated_context.merged_maps_details[merged_id_1]['final_output_path'] == str(final_path_merged1)
    mock_log_error.assert_not_called()
    # Optionally check the copy logs, e.g.:
    # mock_log_info.assert_any_call(f"Copying {processed_details[proc_id_1]['temp_processed_file']} to {final_path_proc1}")

@mock.patch('shutil.copy2')
@mock.patch('pathlib.Path.mkdir')  # May still be called for other files
@mock.patch('pathlib.Path.exists')
@mock.patch('processing.pipeline.stages.output_organization.generate_path_from_pattern')
@mock.patch('logging.info')
def test_output_organization_overwrite_disabled_file_exists(
    mock_log_info, mock_gen_path, mock_path_exists, mock_mkdir, mock_shutil_copy
):
    stage = OutputOrganizationStage()
    proc_id_1 = uuid.uuid4().hex
    processed_details = {
        proc_id_1: {'status': 'Processed', 'temp_processed_file': '/fake/temp_engine_dir/proc_exists.png', 'map_type': 'Diffuse', 'output_filename': 'OutputOrgAsset_Diffuse_Exists.png'}
    }
    context = create_output_org_mock_context(
        processed_map_details=processed_details,
        overwrite_setting=False
    )

    final_path_proc1 = Path("/fake/output_final/OutputOrgAsset/Diffuse/OutputOrgAsset_Diffuse_Exists.png")
    mock_gen_path.return_value = final_path_proc1  # Only one file
    mock_path_exists.return_value = True  # File exists at destination

    updated_context = stage.execute(context)

    mock_shutil_copy.assert_not_called()
    mock_log_info.assert_any_call(
        f"Skipping copy for {final_path_proc1} as it already exists and overwrite is disabled."
    )
    # An existing file still counts as "organized", so it is recorded anyway
    assert str(final_path_proc1) in updated_context.asset_metadata['final_output_files']
    assert updated_context.processed_maps_details[proc_id_1]['final_output_path'] == str(final_path_proc1)

@mock.patch('shutil.copy2')
@mock.patch('pathlib.Path.mkdir')
@mock.patch('pathlib.Path.exists')
@mock.patch('processing.pipeline.stages.output_organization.generate_path_from_pattern')
@mock.patch('logging.info')
@mock.patch('logging.error')
def test_output_organization_overwrite_enabled_file_exists(
    mock_log_error, mock_log_info, mock_gen_path, mock_path_exists, mock_mkdir, mock_shutil_copy
):
    stage = OutputOrganizationStage()
    proc_id_1 = uuid.uuid4().hex
    processed_details = {
        proc_id_1: {'status': 'Processed', 'temp_processed_file': '/fake/temp_engine_dir/proc_overwrite.png', 'map_type': 'Diffuse', 'output_filename': 'OutputOrgAsset_Diffuse_Overwrite.png'}
    }
    context = create_output_org_mock_context(
        processed_map_details=processed_details,
        overwrite_setting=True  # Overwrite is enabled
    )

    final_path_proc1 = Path("/fake/output_final/OutputOrgAsset/Diffuse/OutputOrgAsset_Diffuse_Overwrite.png")
    mock_gen_path.return_value = final_path_proc1
    mock_path_exists.return_value = True  # File exists, but should be overwritten

    updated_context = stage.execute(context)

    mock_shutil_copy.assert_called_once_with(Path(processed_details[proc_id_1]['temp_processed_file']), final_path_proc1)
    mock_mkdir.assert_called_once_with(final_path_proc1.parent, parents=True, exist_ok=True)
    assert str(final_path_proc1) in updated_context.asset_metadata['final_output_files']
    assert updated_context.processed_maps_details[proc_id_1]['final_output_path'] == str(final_path_proc1)
    mock_log_error.assert_not_called()
    # Optionally check for an overwrite log message, if implemented:
    # mock_log_info.assert_any_call(f"Overwriting existing file {final_path_proc1}...")

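The two overwrite tests above pin down a simple guard; a hypothetical sketch of the stage's copy step (function name and return convention are assumptions, not the stage's actual API):

```python
import logging
import shutil
from pathlib import Path

def organize_file(temp_file: Path, final_path: Path, overwrite: bool) -> Path:
    """Copy temp_file to final_path unless it already exists and overwrite is off.

    Either way the file ends up "organized" at final_path, matching the tests
    above, where an existing file is still recorded in
    asset_metadata['final_output_files'].
    """
    if final_path.exists() and not overwrite:
        logging.info(f"Skipping copy for {final_path} as it already exists and overwrite is disabled.")
        return final_path
    final_path.parent.mkdir(parents=True, exist_ok=True)  # before copy, as the tests assert
    shutil.copy2(temp_file, final_path)
    return final_path
```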
@mock.patch('shutil.copy2')
@mock.patch('pathlib.Path.mkdir')
@mock.patch('pathlib.Path.exists')
@mock.patch('processing.pipeline.stages.output_organization.generate_path_from_pattern')
@mock.patch('logging.error')
def test_output_organization_only_processed_maps(
    mock_log_error, mock_gen_path, mock_path_exists, mock_mkdir, mock_shutil_copy
):
    stage = OutputOrganizationStage()
    proc_id_1 = uuid.uuid4().hex
    processed_details = {
        proc_id_1: {'status': 'Processed', 'temp_processed_file': '/fake/temp_engine_dir/proc_only.png', 'map_type': 'Albedo', 'output_filename': 'OutputOrgAsset_Albedo.png'}
    }
    context = create_output_org_mock_context(
        processed_map_details=processed_details,
        merged_map_details={},  # No merged maps
        overwrite_setting=False
    )

    final_path_proc1 = Path("/fake/output_final/OutputOrgAsset/Albedo/OutputOrgAsset_Albedo.png")
    mock_gen_path.return_value = final_path_proc1
    mock_path_exists.return_value = False

    updated_context = stage.execute(context)

    mock_shutil_copy.assert_called_once_with(Path(processed_details[proc_id_1]['temp_processed_file']), final_path_proc1)
    mock_mkdir.assert_called_once_with(final_path_proc1.parent, parents=True, exist_ok=True)
    assert len(updated_context.asset_metadata['final_output_files']) == 1
    assert str(final_path_proc1) in updated_context.asset_metadata['final_output_files']
    assert updated_context.processed_maps_details[proc_id_1]['final_output_path'] == str(final_path_proc1)
    assert not updated_context.merged_maps_details  # Should remain empty
    mock_log_error.assert_not_called()

@mock.patch('shutil.copy2')
@mock.patch('pathlib.Path.mkdir')
@mock.patch('pathlib.Path.exists')
@mock.patch('processing.pipeline.stages.output_organization.generate_path_from_pattern')
@mock.patch('logging.error')
def test_output_organization_only_merged_maps(
    mock_log_error, mock_gen_path, mock_path_exists, mock_mkdir, mock_shutil_copy
):
    stage = OutputOrganizationStage()
    merged_id_1 = uuid.uuid4().hex
    merged_details = {
        merged_id_1: {'status': 'Processed', 'temp_merged_file': '/fake/temp_engine_dir/merged_only.png', 'map_type': 'Metallic', 'output_filename': 'OutputOrgAsset_Metallic.png'}
    }
    context = create_output_org_mock_context(
        processed_map_details={},  # No processed maps
        merged_map_details=merged_details,
        overwrite_setting=False
    )

    final_path_merged1 = Path("/fake/output_final/OutputOrgAsset/Metallic/OutputOrgAsset_Metallic.png")
    mock_gen_path.return_value = final_path_merged1
    mock_path_exists.return_value = False

    updated_context = stage.execute(context)

    mock_shutil_copy.assert_called_once_with(Path(merged_details[merged_id_1]['temp_merged_file']), final_path_merged1)
    mock_mkdir.assert_called_once_with(final_path_merged1.parent, parents=True, exist_ok=True)
    assert len(updated_context.asset_metadata['final_output_files']) == 1
    assert str(final_path_merged1) in updated_context.asset_metadata['final_output_files']
    assert updated_context.merged_maps_details[merged_id_1]['final_output_path'] == str(final_path_merged1)
    assert not updated_context.processed_maps_details  # Should remain empty
    mock_log_error.assert_not_called()

@mock.patch('shutil.copy2')
@mock.patch('pathlib.Path.mkdir')
@mock.patch('pathlib.Path.exists')
@mock.patch('processing.pipeline.stages.output_organization.generate_path_from_pattern')
@mock.patch('logging.warning')  # Expect a warning for the skipped map
@mock.patch('logging.error')
def test_output_organization_map_status_not_processed(
    mock_log_error, mock_log_warning, mock_gen_path, mock_path_exists, mock_mkdir, mock_shutil_copy
):
    stage = OutputOrganizationStage()

    proc_id_1_failed = uuid.uuid4().hex
    proc_id_2_ok = uuid.uuid4().hex

    processed_details = {
        proc_id_1_failed: {'status': 'Failed', 'temp_processed_file': '/fake/temp_engine_dir/proc_failed.png', 'map_type': 'Diffuse', 'output_filename': 'OutputOrgAsset_Diffuse_Failed.png'},
        proc_id_2_ok: {'status': 'Processed', 'temp_processed_file': '/fake/temp_engine_dir/proc_ok.png', 'map_type': 'Normal', 'output_filename': 'OutputOrgAsset_Normal_OK.png'}
    }
    context = create_output_org_mock_context(
        processed_map_details=processed_details,
        overwrite_setting=False
    )

    final_path_proc_ok = Path("/fake/output_final/OutputOrgAsset/Normal/OutputOrgAsset_Normal_OK.png")
    # generate_path_from_pattern should only be called for the 'Processed' map
    mock_gen_path.return_value = final_path_proc_ok
    mock_path_exists.return_value = False

    updated_context = stage.execute(context)

    # Copy is only performed for the 'Processed' map
    mock_shutil_copy.assert_called_once_with(Path(processed_details[proc_id_2_ok]['temp_processed_file']), final_path_proc_ok)
    mock_mkdir.assert_called_once_with(final_path_proc_ok.parent, parents=True, exist_ok=True)

    # final_output_files only contains the successfully processed map
    assert len(updated_context.asset_metadata['final_output_files']) == 1
    assert str(final_path_proc_ok) in updated_context.asset_metadata['final_output_files']

    # final_output_path is set for the processed map, but not for the failed one
    assert updated_context.processed_maps_details[proc_id_2_ok]['final_output_path'] == str(final_path_proc_ok)
    assert 'final_output_path' not in updated_context.processed_maps_details[proc_id_1_failed]

    mock_log_warning.assert_any_call(
        f"Skipping output organization for map with ID {proc_id_1_failed} (type: Diffuse) as its status is 'Failed'."
    )
    mock_log_error.assert_not_called()

@mock.patch('shutil.copy2')
@mock.patch('pathlib.Path.mkdir')
@mock.patch('pathlib.Path.exists')
@mock.patch('processing.pipeline.stages.output_organization.generate_path_from_pattern')
@mock.patch('logging.error')
def test_output_organization_generate_path_fails(
    mock_log_error, mock_gen_path, mock_path_exists, mock_mkdir, mock_shutil_copy
):
    stage = OutputOrganizationStage()
    proc_id_1 = uuid.uuid4().hex
    processed_details = {
        proc_id_1: {'status': 'Processed', 'temp_processed_file': '/fake/temp_engine_dir/proc_path_fail.png', 'map_type': 'Roughness', 'output_filename': 'OutputOrgAsset_Roughness_PathFail.png'}
    }
    context = create_output_org_mock_context(
        processed_map_details=processed_details,
        overwrite_setting=False
    )

    mock_gen_path.side_effect = Exception("Simulated path generation error")
    mock_path_exists.return_value = False  # Irrelevant once path generation fails

    updated_context = stage.execute(context)

    mock_shutil_copy.assert_not_called()  # No copy if path generation fails
    mock_mkdir.assert_not_called()  # No mkdir either

    assert not updated_context.asset_metadata.get('final_output_files')  # No files should be listed
    assert 'final_output_path' not in updated_context.processed_maps_details[proc_id_1]

    assert updated_context.status_flags.get('output_organization_error') is True
    assert updated_context.asset_metadata['status'] == "Error"  # Or "Failed", depending on desired behavior

    mock_log_error.assert_any_call(
        f"Error generating output path for map ID {proc_id_1} (type: Roughness): Simulated path generation error"
    )

@mock.patch('shutil.copy2')
@mock.patch('pathlib.Path.mkdir')
@mock.patch('pathlib.Path.exists')
@mock.patch('processing.pipeline.stages.output_organization.generate_path_from_pattern')
@mock.patch('logging.error')
def test_output_organization_shutil_copy_fails(
    mock_log_error, mock_gen_path, mock_path_exists, mock_mkdir, mock_shutil_copy
):
    stage = OutputOrganizationStage()
    proc_id_1 = uuid.uuid4().hex
    processed_details = {
        proc_id_1: {'status': 'Processed', 'temp_processed_file': '/fake/temp_engine_dir/proc_copy_fail.png', 'map_type': 'AO', 'output_filename': 'OutputOrgAsset_AO_CopyFail.png'}
    }
    context = create_output_org_mock_context(
        processed_map_details=processed_details,
        overwrite_setting=False
    )

    final_path_proc1 = Path("/fake/output_final/OutputOrgAsset/AO/OutputOrgAsset_AO_CopyFail.png")
    mock_gen_path.return_value = final_path_proc1
    mock_path_exists.return_value = False
    mock_shutil_copy.side_effect = shutil.Error("Simulated copy error")  # Could also be IOError/OSError

    updated_context = stage.execute(context)

    mock_mkdir.assert_called_once_with(final_path_proc1.parent, parents=True, exist_ok=True)  # mkdir happens before copy
    mock_shutil_copy.assert_called_once_with(Path(processed_details[proc_id_1]['temp_processed_file']), final_path_proc1)

    # A failed copy is not recorded as a final output (assumed behavior; if the
    # stage records the *attempted* path instead, adjust these assertions).
    assert not updated_context.asset_metadata.get('final_output_files')
    assert 'final_output_path' not in updated_context.processed_maps_details[proc_id_1]

    assert updated_context.status_flags.get('output_organization_error') is True
    assert updated_context.asset_metadata['status'] == "Error"  # Or "Failed"

    mock_log_error.assert_any_call(
        f"Error copying file {processed_details[proc_id_1]['temp_processed_file']} to {final_path_proc1}: Simulated copy error"
    )

213
tests/processing/pipeline/stages/test_supplier_determination.py
Normal file
@@ -0,0 +1,213 @@

import pytest
from unittest import mock
from pathlib import Path
from typing import Dict, List, Optional, Any

# Assuming pytest is run from the project root; adjust if necessary
from processing.pipeline.stages.supplier_determination import SupplierDeterminationStage
from processing.pipeline.asset_context import AssetProcessingContext
from rule_structure import AssetRule, SourceRule, FileRule  # For constructing the mock context
from configuration import Configuration, GeneralSettings, Supplier  # For the mock config


# Example helper (could also be a pytest fixture)
def create_mock_context(
    asset_rule_supplier_override: Optional[str] = None,
    source_rule_supplier: Optional[str] = None,
    config_suppliers: Optional[Dict[str, Any]] = None,  # Mocked Supplier objects or dicts
    asset_name: str = "TestAsset"
) -> AssetProcessingContext:
    mock_asset_rule = mock.MagicMock(spec=AssetRule)
    mock_asset_rule.name = asset_name
    mock_asset_rule.supplier_override = asset_rule_supplier_override
    # ... other AssetRule fields if needed by the stage ...

    mock_source_rule = mock.MagicMock(spec=SourceRule)
    mock_source_rule.supplier = source_rule_supplier
    # ... other SourceRule fields ...

    mock_config = mock.MagicMock(spec=Configuration)
    mock_config.suppliers = config_suppliers if config_suppliers is not None else {}

    # Basic AssetProcessingContext fields
    context = AssetProcessingContext(
        source_rule=mock_source_rule,
        asset_rule=mock_asset_rule,
        workspace_path=Path("/fake/workspace"),
        engine_temp_dir=Path("/fake/temp"),
        output_base_path=Path("/fake/output"),
        effective_supplier=None,
        asset_metadata={},
        processed_maps_details={},
        merged_maps_details={},
        files_to_process=[],
        loaded_data_cache={},
        config_obj=mock_config,
        status_flags={},
        incrementing_value=None,
        sha5_value=None  # TODO: confirm field name (sha5_value vs. sha256_value)
    )
    return context


@pytest.fixture
def supplier_stage():
    return SupplierDeterminationStage()

@mock.patch('logging.error')
@mock.patch('logging.info')
def test_supplier_from_asset_rule_override_valid(mock_log_info, mock_log_error, supplier_stage):
    mock_suppliers_config = {"SupplierA": mock.MagicMock(spec=Supplier)}
    context = create_mock_context(
        asset_rule_supplier_override="SupplierA",
        config_suppliers=mock_suppliers_config
    )

    updated_context = supplier_stage.execute(context)

    assert updated_context.effective_supplier == "SupplierA"
    assert not updated_context.status_flags.get('supplier_error')
    mock_log_info.assert_any_call("Effective supplier for asset 'TestAsset' set to 'SupplierA' from asset rule override.")
    mock_log_error.assert_not_called()

@mock.patch('logging.error')
@mock.patch('logging.info')
def test_supplier_from_source_rule_fallback_valid(mock_log_info, mock_log_error, supplier_stage):
    mock_suppliers_config = {"SupplierB": mock.MagicMock(spec=Supplier)}
    context = create_mock_context(
        asset_rule_supplier_override=None,
        source_rule_supplier="SupplierB",
        config_suppliers=mock_suppliers_config
    )

    updated_context = supplier_stage.execute(context)

    assert updated_context.effective_supplier == "SupplierB"
    assert not updated_context.status_flags.get('supplier_error')
    mock_log_info.assert_any_call("Effective supplier for asset 'TestAsset' set to 'SupplierB' from source rule.")
    mock_log_error.assert_not_called()

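The precedence these tests pin down (asset-rule override first, then the source-rule supplier, each validated against the configured suppliers) can be sketched as follows; the helper name and return shape are illustrative, not the stage's actual API, and the no-fallback behavior for an invalid override is an assumption based on the "No supplier set" wording in the warnings below:

```python
import logging
from typing import Optional

def determine_supplier(override: Optional[str],
                       source_supplier: Optional[str],
                       known_suppliers: dict) -> Optional[str]:
    """Pick the effective supplier: the asset-rule override wins over the
    source-rule supplier, and a candidate absent from known_suppliers is
    rejected outright (assumed: no fallback to the next source)."""
    for candidate, origin in ((override, "asset rule override"),
                              (source_supplier, "source rule")):
        if candidate is None:
            continue
        if candidate in known_suppliers:
            logging.info(f"Effective supplier set to '{candidate}' from {origin}.")
            return candidate
        logging.warning(f"Supplier '{candidate}' from {origin} is not defined in global suppliers. No supplier set.")
        return None
    return None
```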
@mock.patch('logging.error')
@mock.patch('logging.warning')  # supplier_determination logs invalid suppliers as warnings
def test_asset_rule_override_invalid_supplier(mock_log_warning, mock_log_error, supplier_stage):
    context = create_mock_context(
        asset_rule_supplier_override="InvalidSupplier",
        config_suppliers={"SupplierA": mock.MagicMock(spec=Supplier)}  # "InvalidSupplier" not in config
    )

    updated_context = supplier_stage.execute(context)

    assert updated_context.effective_supplier is None
    assert updated_context.status_flags.get('supplier_error') is True
    mock_log_warning.assert_any_call(
        "Asset 'TestAsset' has supplier_override 'InvalidSupplier' which is not defined in global suppliers. No supplier set."
    )
    mock_log_error.assert_not_called()

@mock.patch('logging.error')
|
||||
@mock.patch('logging.warning')
|
||||
def test_source_rule_fallback_invalid_supplier(mock_log_warning, mock_log_error, supplier_stage):
|
||||
context = create_mock_context(
|
||||
asset_rule_supplier_override=None,
|
||||
source_rule_supplier="InvalidSupplierB",
|
||||
config_suppliers={"SupplierA": mock.MagicMock(spec=Supplier)} # "InvalidSupplierB" not in config
|
||||
)
|
||||
|
||||
updated_context = supplier_stage.execute(context)
|
||||
|
||||
assert updated_context.effective_supplier is None
|
||||
assert updated_context.status_flags.get('supplier_error') is True
|
||||
mock_log_warning.assert_any_call(
|
||||
"Asset 'TestAsset' has source rule supplier 'InvalidSupplierB' which is not defined in global suppliers. No supplier set."
|
||||
)
|
||||
mock_log_error.assert_not_called()
|
||||
|
||||
@mock.patch('logging.error')
|
||||
@mock.patch('logging.warning')
|
||||
def test_no_supplier_defined(mock_log_warning, mock_log_error, supplier_stage):
|
||||
context = create_mock_context(
|
||||
asset_rule_supplier_override=None,
|
||||
source_rule_supplier=None,
|
||||
config_suppliers={"SupplierA": mock.MagicMock(spec=Supplier)}
|
||||
)
|
||||
|
||||
updated_context = supplier_stage.execute(context)
|
||||
|
||||
assert updated_context.effective_supplier is None
|
||||
assert updated_context.status_flags.get('supplier_error') is True
|
||||
mock_log_warning.assert_any_call(
|
||||
"No supplier could be determined for asset 'TestAsset'. "
|
||||
"AssetRule override is None and SourceRule supplier is None or empty."
|
||||
)
|
||||
mock_log_error.assert_not_called()
|
||||
|
||||
@mock.patch('logging.error')
|
||||
@mock.patch('logging.warning')
|
||||
def test_empty_config_suppliers_with_asset_override(mock_log_warning, mock_log_error, supplier_stage):
|
||||
context = create_mock_context(
|
||||
asset_rule_supplier_override="SupplierX",
|
||||
config_suppliers={} # Empty global supplier config
|
||||
)
|
||||
|
||||
updated_context = supplier_stage.execute(context)
|
||||
|
||||
assert updated_context.effective_supplier is None
|
||||
assert updated_context.status_flags.get('supplier_error') is True
|
||||
mock_log_warning.assert_any_call(
|
||||
"Asset 'TestAsset' has supplier_override 'SupplierX' which is not defined in global suppliers. No supplier set."
|
||||
)
|
||||
mock_log_error.assert_not_called()
|
||||
|
||||
@mock.patch('logging.error')
|
||||
@mock.patch('logging.warning')
|
||||
def test_empty_config_suppliers_with_source_rule(mock_log_warning, mock_log_error, supplier_stage):
|
||||
context = create_mock_context(
|
||||
source_rule_supplier="SupplierY",
|
||||
config_suppliers={} # Empty global supplier config
|
||||
)
|
||||
|
||||
updated_context = supplier_stage.execute(context)
|
||||
|
||||
assert updated_context.effective_supplier is None
|
||||
assert updated_context.status_flags.get('supplier_error') is True
|
||||
mock_log_warning.assert_any_call(
|
||||
"Asset 'TestAsset' has source rule supplier 'SupplierY' which is not defined in global suppliers. No supplier set."
|
||||
)
|
||||
mock_log_error.assert_not_called()
|
||||
|
||||
@mock.patch('logging.error')
|
||||
@mock.patch('logging.info')
|
||||
def test_asset_rule_override_empty_string(mock_log_info, mock_log_error, supplier_stage):
|
||||
# This scenario should fall back to source_rule.supplier if asset_rule.supplier_override is ""
|
||||
mock_suppliers_config = {"SupplierB": mock.MagicMock(spec=Supplier)}
|
||||
context = create_mock_context(
|
||||
asset_rule_supplier_override="", # Empty string override
|
||||
source_rule_supplier="SupplierB",
|
||||
config_suppliers=mock_suppliers_config
|
||||
)
|
||||
|
||||
updated_context = supplier_stage.execute(context)
|
||||
|
||||
assert updated_context.effective_supplier == "SupplierB" # Falls back to SourceRule
|
||||
assert not updated_context.status_flags.get('supplier_error')
|
||||
mock_log_info.assert_any_call("Effective supplier for asset 'TestAsset' set to 'SupplierB' from source rule.")
|
||||
mock_log_error.assert_not_called()
|
||||
|
||||
@mock.patch('logging.error')
|
||||
@mock.patch('logging.warning')
|
||||
def test_source_rule_supplier_empty_string(mock_log_warning, mock_log_error, supplier_stage):
|
||||
# This scenario should result in an error if asset_rule.supplier_override is None and source_rule.supplier is ""
|
||||
context = create_mock_context(
|
||||
asset_rule_supplier_override=None,
|
||||
source_rule_supplier="", # Empty string source supplier
|
||||
config_suppliers={"SupplierA": mock.MagicMock(spec=Supplier)}
|
||||
)
|
||||
|
||||
updated_context = supplier_stage.execute(context)
|
||||
|
||||
assert updated_context.effective_supplier is None
|
||||
assert updated_context.status_flags.get('supplier_error') is True
|
||||
mock_log_warning.assert_any_call(
|
||||
"No supplier could be determined for asset 'TestAsset'. "
|
||||
"AssetRule override is None and SourceRule supplier is None or empty."
|
||||
)
|
||||
mock_log_error.assert_not_called()
|
||||
383  tests/processing/pipeline/test_orchestrator.py  Normal file
@@ -0,0 +1,383 @@
import pytest
from unittest import mock
from pathlib import Path
import uuid
import shutil  # For checking rmtree
import tempfile  # For mocking mkdtemp

from processing.pipeline.orchestrator import PipelineOrchestrator
from processing.pipeline.asset_context import AssetProcessingContext
from processing.pipeline.stages.base_stage import ProcessingStage  # For mocking stages
from rule_structure import SourceRule, AssetRule, FileRule
from configuration import Configuration, GeneralSettings


# Mock stage that records calls and modifies the context
class MockPassThroughStage(ProcessingStage):
    def __init__(self, stage_name="mock_stage"):
        self.stage_name = stage_name
        self.execute_call_count = 0
        self.contexts_called_with = []  # To store contexts for verification

    def execute(self, context: AssetProcessingContext) -> AssetProcessingContext:
        self.execute_call_count += 1
        self.contexts_called_with.append(context)
        # Optionally modify the context for testing
        context.asset_metadata[f'{self.stage_name}_executed'] = True
        if self.stage_name == "skipper_stage":  # Example conditional logic
            context.status_flags['skip_asset'] = True
            context.status_flags['skip_reason'] = "Skipped by skipper_stage"
        elif self.stage_name == "error_stage":  # Example error-raising stage
            raise ValueError("Simulated error in error_stage")

        # Simulate a status update based on stage execution
        if not context.status_flags.get('skip_asset') and not context.status_flags.get('asset_failed'):
            context.asset_metadata['status'] = "Processed"  # Default to processed if not skipped/failed
        return context


def create_orchestrator_test_config() -> mock.MagicMock:
    mock_config = mock.MagicMock(spec=Configuration)
    mock_config.general_settings = mock.MagicMock(spec=GeneralSettings)
    mock_config.general_settings.temp_dir_override = None  # Default; can be overridden in tests
    # Add other config details if the orchestrator or stages depend on them directly
    return mock_config


def create_orchestrator_test_asset_rule(name: str, num_file_rules: int = 1) -> mock.MagicMock:
    asset_rule = mock.MagicMock(spec=AssetRule)
    asset_rule.name = name
    asset_rule.id = uuid.uuid4()
    asset_rule.source_path = Path(f"/fake/source/{name}")  # Using a Path object
    asset_rule.file_rules = [mock.MagicMock(spec=FileRule) for _ in range(num_file_rules)]
    asset_rule.enabled = True
    asset_rule.map_types = {}  # Initialize as dict
    asset_rule.material_name_scheme = "{asset_name}"
    asset_rule.texture_name_scheme = "{asset_name}_{map_type}"
    asset_rule.output_path_scheme = "{source_name}/{asset_name}"
    # ... other necessary AssetRule fields ...
    return asset_rule


def create_orchestrator_test_source_rule(name: str, num_assets: int = 1, asset_names: list = None) -> mock.MagicMock:
    source_rule = mock.MagicMock(spec=SourceRule)
    source_rule.name = name
    source_rule.id = uuid.uuid4()
    if asset_names:
        source_rule.assets = [create_orchestrator_test_asset_rule(an) for an in asset_names]
    else:
        source_rule.assets = [create_orchestrator_test_asset_rule(f"Asset_{i+1}_in_{name}") for i in range(num_assets)]
    source_rule.enabled = True
    source_rule.source_path = Path(f"/fake/source_root/{name}")  # Using a Path object
    # ... other necessary SourceRule fields ...
    return source_rule
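The control flow these fixtures assume can be sketched as follows. This is a minimal, hypothetical skeleton of `PipelineOrchestrator.process_source_rule`, not the real implementation: it assumes the `processed`/`skipped`/`failed` results shape, the per-source-rule temp directory named by `source_rule.id.hex`, skipping of disabled asset rules, and per-asset error isolation, all of which the tests below verify. `SimpleNamespace` stands in for `AssetProcessingContext`.

```python
import shutil
import tempfile
from pathlib import Path
from types import SimpleNamespace


class PipelineOrchestrator:
    """Run each enabled asset of a source rule through the stage pipeline."""

    def __init__(self, config_obj, stages):
        self.config_obj = config_obj
        self.stages = stages

    def process_source_rule(self, source_rule, workspace_path, output_base_path,
                            dry_run, incrementing_value, sha5_value):
        results = {'processed': [], 'skipped': [], 'failed': []}
        override = self.config_obj.general_settings.temp_dir_override
        # Use the configured override if set; otherwise create a fresh temp dir.
        base_temp = Path(override) if override else Path(tempfile.mkdtemp())
        source_temp = base_temp / source_rule.id.hex
        try:
            for asset_rule in source_rule.assets:
                if not asset_rule.enabled:
                    results['skipped'].append(asset_rule.name)
                    continue
                context = SimpleNamespace(
                    source_rule=source_rule, asset_rule=asset_rule,
                    workspace_path=workspace_path, output_base_path=output_base_path,
                    config_obj=self.config_obj,
                    incrementing_value=incrementing_value, sha5_value=sha5_value,
                    engine_temp_dir=source_temp / asset_rule.id.hex,
                    asset_metadata={}, status_flags={}, shared_data={}, current_files=[],
                )
                try:
                    for stage in self.stages:
                        context = stage.execute(context)
                        if context.status_flags.get('skip_asset'):
                            break  # later stages are not run for a skipped asset
                    bucket = 'skipped' if context.status_flags.get('skip_asset') else 'processed'
                    results[bucket].append(asset_rule.name)
                except Exception as exc:
                    # An error in one asset must not stop the others.
                    context.status_flags['asset_failed'] = True
                    context.status_flags['failure_reason'] = str(exc)
                    results['failed'].append(asset_rule.name)
        finally:
            shutil.rmtree(source_temp, ignore_errors=True)  # cleanup of the source rule's temp dir
        return results
```

This sketch cleans up only the source rule's subdirectory, matching the `rmtree` expectations in the tests; whether the parent `mkdtemp` directory is reused or removed elsewhere is an open implementation detail.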
# --- Test Cases for PipelineOrchestrator.process_source_rule() ---

@mock.patch('shutil.rmtree')
@mock.patch('tempfile.mkdtemp')
def test_orchestrator_basic_flow_mock_stages(mock_mkdtemp, mock_rmtree):
    mock_mkdtemp.return_value = "/fake/engine_temp_dir_path"  # Path returned by mkdtemp

    config = create_orchestrator_test_config()
    stage1 = MockPassThroughStage("stage1")
    stage2 = MockPassThroughStage("stage2")
    orchestrator = PipelineOrchestrator(config_obj=config, stages=[stage1, stage2])

    source_rule = create_orchestrator_test_source_rule("MySourceRule", num_assets=2)
    asset1_name = source_rule.assets[0].name
    asset2_name = source_rule.assets[1].name

    # MockPassThroughStage sets 'status' = "Processed" if not skipped/failed
    # and adds '{stage_name}_executed' = True to asset_metadata.

    results = orchestrator.process_source_rule(
        source_rule, Path("/ws"), Path("/out"), False, "inc_val_123", "sha_val_abc"
    )

    assert stage1.execute_call_count == 2  # Called for each asset
    assert stage2.execute_call_count == 2  # Called for each asset

    assert asset1_name in results['processed']
    assert asset2_name in results['processed']
    assert not results['skipped']
    assert not results['failed']

    # Verify context modifications by stages
    for i in range(2):  # For each asset
        # Stage 1 context checks
        s1_context_asset = stage1.contexts_called_with[i]
        assert s1_context_asset.asset_metadata.get('stage1_executed') is True
        assert s1_context_asset.asset_metadata.get('stage2_executed') is None  # Stage 2 not yet run for this asset

        # Stage 2 context checks
        s2_context_asset = stage2.contexts_called_with[i]
        assert s2_context_asset.asset_metadata.get('stage1_executed') is True  # From stage 1
        assert s2_context_asset.asset_metadata.get('stage2_executed') is True
        assert s2_context_asset.asset_metadata.get('status') == "Processed"

    mock_mkdtemp.assert_called_once()
    # The orchestrator creates a subdirectory within the mkdtemp path
    expected_temp_path = Path(mock_mkdtemp.return_value) / source_rule.id.hex
    mock_rmtree.assert_called_once_with(expected_temp_path, ignore_errors=True)


@mock.patch('shutil.rmtree')
@mock.patch('tempfile.mkdtemp')
def test_orchestrator_asset_skipping_by_stage(mock_mkdtemp, mock_rmtree):
    mock_mkdtemp.return_value = "/fake/engine_temp_dir_path_skip"

    config = create_orchestrator_test_config()
    skipper_stage = MockPassThroughStage("skipper_stage")  # This stage sets skip_asset = True
    stage_after_skip = MockPassThroughStage("stage_after_skip")

    orchestrator = PipelineOrchestrator(config_obj=config, stages=[skipper_stage, stage_after_skip])

    source_rule = create_orchestrator_test_source_rule("SkipSourceRule", num_assets=1)
    asset_to_skip_name = source_rule.assets[0].name

    results = orchestrator.process_source_rule(
        source_rule, Path("/ws_skip"), Path("/out_skip"), False, "inc_skip", "sha_skip"
    )

    assert skipper_stage.execute_call_count == 1  # Called for the asset
    assert stage_after_skip.execute_call_count == 0  # Not called because the asset was skipped

    assert asset_to_skip_name in results['skipped']
    assert not results['processed']
    assert not results['failed']

    # Verify the skip reason in the context (MockPassThroughStage stores contexts)
    skipped_context = skipper_stage.contexts_called_with[0]
    assert skipped_context.status_flags['skip_asset'] is True
    assert skipped_context.status_flags['skip_reason'] == "Skipped by skipper_stage"

    mock_mkdtemp.assert_called_once()
    expected_temp_path = Path(mock_mkdtemp.return_value) / source_rule.id.hex
    mock_rmtree.assert_called_once_with(expected_temp_path, ignore_errors=True)


@mock.patch('shutil.rmtree')
@mock.patch('tempfile.mkdtemp')
def test_orchestrator_no_assets_in_source_rule(mock_mkdtemp, mock_rmtree):
    mock_mkdtemp.return_value = "/fake/engine_temp_dir_no_assets"

    config = create_orchestrator_test_config()
    stage1 = MockPassThroughStage("stage1_no_assets")
    orchestrator = PipelineOrchestrator(config_obj=config, stages=[stage1])

    source_rule = create_orchestrator_test_source_rule("NoAssetSourceRule", num_assets=0)

    results = orchestrator.process_source_rule(
        source_rule, Path("/ws_no_assets"), Path("/out_no_assets"), False, "inc_no", "sha_no"
    )

    assert stage1.execute_call_count == 0
    assert not results['processed']
    assert not results['skipped']
    assert not results['failed']

    # mkdtemp should still be called for source-rule processing, even with no assets
    mock_mkdtemp.assert_called_once()
    expected_temp_path = Path(mock_mkdtemp.return_value) / source_rule.id.hex
    mock_rmtree.assert_called_once_with(expected_temp_path, ignore_errors=True)


@mock.patch('shutil.rmtree')
@mock.patch('tempfile.mkdtemp')
def test_orchestrator_error_during_stage_execution(mock_mkdtemp, mock_rmtree):
    mock_mkdtemp.return_value = "/fake/engine_temp_dir_error"

    config = create_orchestrator_test_config()
    error_stage = MockPassThroughStage("error_stage")  # This stage raises an error
    stage_after_error = MockPassThroughStage("stage_after_error")

    orchestrator = PipelineOrchestrator(config_obj=config, stages=[error_stage, stage_after_error])

    # Two assets: one fails, one processes. The orchestrator processes each asset
    # independently, so an error in one asset must not stop the other assets in
    # the same source_rule.
    source_rule = create_orchestrator_test_source_rule("ErrorSourceRule", asset_names=["AssetFails", "AssetSucceeds"])
    asset_fails_name = source_rule.assets[0].name
    asset_succeeds_name = source_rule.assets[1].name

    # Make only the first asset's processing trigger the error. MockPassThroughStage
    # raises ValueError("Simulated error in error_stage") for "error_stage"; the
    # side effect below routes only "AssetFails" through that erroring path, while
    # the second asset passes through cleanly. The orchestrator's
    # _process_single_asset catches the generic Exception.
    original_execute = error_stage.execute

    def error_execute_side_effect(context: AssetProcessingContext):
        if context.asset_rule.name == asset_fails_name:
            return original_execute(context)  # This calls the erroring logic
        else:
            context.asset_metadata[f'{error_stage.stage_name}_executed'] = True
            context.asset_metadata['status'] = "Processed"
            return context

    error_stage.execute = mock.MagicMock(side_effect=error_execute_side_effect)
    # stage_after_error should still be called for the successful asset

    results = orchestrator.process_source_rule(
        source_rule, Path("/ws_error"), Path("/out_error"), False, "inc_err", "sha_err"
    )

    assert error_stage.execute.call_count == 2  # Called for both assets
    # stage_after_error is only called for the asset that did not fail in error_stage
    assert stage_after_error.execute_call_count == 1

    assert asset_fails_name in results['failed']
    assert asset_succeeds_name in results['processed']
    assert not results['skipped']

    # Verify the context of the failed asset
    failed_context = None
    for ctx in error_stage.contexts_called_with:
        if ctx.asset_rule.name == asset_fails_name:
            failed_context = ctx
            break
    assert failed_context is not None
    assert failed_context.status_flags['asset_failed'] is True
    assert "Simulated error in error_stage" in failed_context.status_flags['failure_reason']

    # Verify the context of the successful asset after stage_after_error
    successful_context_after_s2 = None
    for ctx in stage_after_error.contexts_called_with:
        if ctx.asset_rule.name == asset_succeeds_name:
            successful_context_after_s2 = ctx
            break
    assert successful_context_after_s2 is not None
    assert successful_context_after_s2.asset_metadata.get('error_stage_executed') is True  # from the non-erroring path
    assert successful_context_after_s2.asset_metadata.get('stage_after_error_executed') is True
    assert successful_context_after_s2.asset_metadata.get('status') == "Processed"

    mock_mkdtemp.assert_called_once()
    expected_temp_path = Path(mock_mkdtemp.return_value) / source_rule.id.hex
    mock_rmtree.assert_called_once_with(expected_temp_path, ignore_errors=True)


@mock.patch('shutil.rmtree')
@mock.patch('tempfile.mkdtemp')
def test_orchestrator_asset_processing_context_initialization(mock_mkdtemp, mock_rmtree):
    mock_engine_temp_dir = "/fake/engine_temp_dir_context_init"
    mock_mkdtemp.return_value = mock_engine_temp_dir

    config = create_orchestrator_test_config()
    mock_stage = MockPassThroughStage("context_check_stage")
    orchestrator = PipelineOrchestrator(config_obj=config, stages=[mock_stage])

    source_rule = create_orchestrator_test_source_rule("ContextSourceRule", num_assets=1)
    asset_rule = source_rule.assets[0]

    workspace_path = Path("/ws_context")
    output_base_path = Path("/out_context")
    incrementing_value = "inc_context_123"
    sha5_value = "sha_context_abc"

    orchestrator.process_source_rule(
        source_rule, workspace_path, output_base_path, False, incrementing_value, sha5_value
    )

    assert mock_stage.execute_call_count == 1

    # Retrieve the context passed to the mock stage
    captured_context = mock_stage.contexts_called_with[0]

    assert captured_context.source_rule == source_rule
    assert captured_context.asset_rule == asset_rule
    assert captured_context.workspace_path == workspace_path

    # The asset's engine_temp_dir is a subdirectory of the source rule's temp dir,
    # which itself is a subdirectory of the main engine temp dir from mkdtemp.
    expected_source_rule_temp_dir = Path(mock_engine_temp_dir) / source_rule.id.hex
    expected_asset_temp_dir = expected_source_rule_temp_dir / asset_rule.id.hex
    assert captured_context.engine_temp_dir == expected_asset_temp_dir

    assert captured_context.output_base_path == output_base_path
    assert captured_context.config_obj == config
    assert captured_context.incrementing_value == incrementing_value
    assert captured_context.sha5_value == sha5_value

    # Check the state of the remaining context fields. Note: contexts_called_with
    # stores a reference to the live context, which the mock stage mutates in
    # place, so asset_metadata contains exactly the keys the stage added while
    # the other fields remain at their initial (empty) values.
    assert captured_context.asset_metadata == {'context_check_stage_executed': True, 'status': "Processed"}
    assert captured_context.status_flags == {}  # Empty initially
    assert captured_context.shared_data == {}  # Empty initially
    assert captured_context.current_files == []  # Empty initially

    mock_mkdtemp.assert_called_once()
    mock_rmtree.assert_called_once_with(expected_source_rule_temp_dir, ignore_errors=True)


@mock.patch('shutil.rmtree')
@mock.patch('tempfile.mkdtemp')
def test_orchestrator_temp_dir_override_from_config(mock_mkdtemp, mock_rmtree):
    # Verifies that when config.general_settings.temp_dir_override is set,
    # mkdtemp is NOT called, and the override path is used and cleaned up.

    config = create_orchestrator_test_config()
    override_temp_path_str = "/override/temp/path"
    config.general_settings.temp_dir_override = override_temp_path_str

    stage1 = MockPassThroughStage("stage_temp_override")
    orchestrator = PipelineOrchestrator(config_obj=config, stages=[stage1])

    source_rule = create_orchestrator_test_source_rule("TempOverrideRule", num_assets=1)
    asset_rule = source_rule.assets[0]

    results = orchestrator.process_source_rule(
        source_rule, Path("/ws_override"), Path("/out_override"), False, "inc_override", "sha_override"
    )

    assert stage1.execute_call_count == 1
    assert asset_rule.name in results['processed']

    mock_mkdtemp.assert_not_called()  # mkdtemp must not be called when the override is set

    # The orchestrator should create its source-rule-specific subdir within the override
    expected_source_rule_temp_dir_in_override = Path(override_temp_path_str) / source_rule.id.hex

    # Verify the context passed to the stage uses the overridden path structure
    captured_context = stage1.contexts_called_with[0]
    expected_asset_temp_dir_in_override = expected_source_rule_temp_dir_in_override / asset_rule.id.hex
    assert captured_context.engine_temp_dir == expected_asset_temp_dir_in_override

    # rmtree should be called on the source rule's directory within the override path
    mock_rmtree.assert_called_once_with(expected_source_rule_temp_dir_in_override, ignore_errors=True)


@mock.patch('shutil.rmtree')
@mock.patch('tempfile.mkdtemp')
def test_orchestrator_disabled_asset_rule_is_skipped(mock_mkdtemp, mock_rmtree):
    mock_mkdtemp.return_value = "/fake/engine_temp_dir_disabled_asset"

    config = create_orchestrator_test_config()
    stage1 = MockPassThroughStage("stage_disabled_check")
    orchestrator = PipelineOrchestrator(config_obj=config, stages=[stage1])

    source_rule = create_orchestrator_test_source_rule("DisabledAssetSourceRule", asset_names=["EnabledAsset", "DisabledAsset"])
    enabled_asset = source_rule.assets[0]
    disabled_asset = source_rule.assets[1]
    disabled_asset.enabled = False  # Disable this asset rule

    results = orchestrator.process_source_rule(
        source_rule, Path("/ws_disabled"), Path("/out_disabled"), False, "inc_dis", "sha_dis"
    )

    assert stage1.execute_call_count == 1  # Only called for the enabled asset

    assert enabled_asset.name in results['processed']
    assert disabled_asset.name in results['skipped']
    assert not results['failed']

    # Verify the context for the processed asset
    assert stage1.contexts_called_with[0].asset_rule.name == enabled_asset.name

    # The orchestrator's _process_single_asset checks asset_rule.enabled and
    # performs the skip itself. The results dict does not record detailed skip
    # reasons, so confirming the asset is in 'skipped' and that stage1 was never
    # called for it is sufficient here.

    mock_mkdtemp.assert_called_once()
    expected_temp_path = Path(mock_mkdtemp.return_value) / source_rule.id.hex
    mock_rmtree.assert_called_once_with(expected_temp_path, ignore_errors=True)
504  tests/processing/utils/test_image_processing_utils.py  Normal file
@@ -0,0 +1,504 @@
import pytest
from unittest import mock
import numpy as np
from pathlib import Path
import sys

# Attempt to import the module under test. This assumes the 'tests' directory
# sits at the same level as the 'processing' directory and that pytest sets up
# PYTHONPATH accordingly.
try:
    from processing.utils import image_processing_utils as ipu
    import cv2  # Needed for constants such as cv2.IMREAD_UNCHANGED
except ImportError:
    # Fallback for environments where PYTHONPATH is not set up as pytest expects:
    # add the project root to sys.path so the 'processing' package can be found.
    # Adjust the number of .parent calls if the test layout is deeper or shallower.
    project_root = Path(__file__).parent.parent.parent.parent
    sys.path.insert(0, str(project_root))
    from processing.utils import image_processing_utils as ipu
    import cv2

# If cv2 is imported directly in image_processing_utils, it may need to be
# mocked globally for some tests, e.g. sys.modules['cv2'] = mock.MagicMock().
# We instead use targeted @mock.patch decorators where cv2 is used.


# --- Tests for Mathematical Helpers ---

def test_is_power_of_two():
    assert ipu.is_power_of_two(1) is True
    assert ipu.is_power_of_two(2) is True
    assert ipu.is_power_of_two(4) is True
    assert ipu.is_power_of_two(16) is True
    assert ipu.is_power_of_two(1024) is True
    assert ipu.is_power_of_two(0) is False
    assert ipu.is_power_of_two(-2) is False
    assert ipu.is_power_of_two(3) is False
    assert ipu.is_power_of_two(100) is False


def test_get_nearest_pot():
    assert ipu.get_nearest_pot(1) == 1
    assert ipu.get_nearest_pot(2) == 2
    # Based on the current implementation:
    # For 3: lower=2, upper=4. (3-2)=1, (4-3)=1. The tie falls to the else branch, which returns upper_pot: 4.
    assert ipu.get_nearest_pot(3) == 4
    assert ipu.get_nearest_pot(50) == 64  # (50-32)=18, (64-50)=14 -> upper
    assert ipu.get_nearest_pot(100) == 128  # (100-64)=36, (128-100)=28 -> upper
    assert ipu.get_nearest_pot(256) == 256
    assert ipu.get_nearest_pot(0) == 1
    assert ipu.get_nearest_pot(-10) == 1
    # For 700: value.bit_length() = 10. lower_pot = 1<<(10-1) = 512. upper_pot = 1<<10 = 1024.
    # (700-512) = 188. (1024-700) = 324. 188 < 324, so lower_pot is returned: 512.
    assert ipu.get_nearest_pot(700) == 512
    assert ipu.get_nearest_pot(6) == 8  # (6-4)=2, (8-6)=2. Tie returns upper.
    assert ipu.get_nearest_pot(5) == 4  # (5-4)=1, (8-5)=3. Returns lower.
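The inline comments above spell out the branch behavior these assertions expect. As a reference, here is a sketch consistent with those comments (the actual `image_processing_utils` code may be written differently):

```python
def is_power_of_two(value: int) -> bool:
    # Bit trick: a positive power of two has exactly one bit set,
    # so value & (value - 1) clears it to zero.
    return value > 0 and (value & (value - 1)) == 0


def get_nearest_pot(value: int) -> int:
    # Clamp non-positive input to the smallest power of two.
    if value < 1:
        return 1
    if is_power_of_two(value):
        return value
    lower_pot = 1 << (value.bit_length() - 1)  # largest POT below value
    upper_pot = 1 << value.bit_length()        # smallest POT above value
    # Return the lower POT only when it is strictly closer; ties round up.
    if (value - lower_pot) < (upper_pot - value):
        return lower_pot
    return upper_pot
```

With this tie-breaking rule, `get_nearest_pot(3)` and `get_nearest_pot(6)` round up (to 4 and 8) while `get_nearest_pot(700)` rounds down to 512, matching the expectations in `test_get_nearest_pot`.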
|
||||
|
||||
@pytest.mark.parametrize(
|
||||
"orig_w, orig_h, target_w, target_h, resize_mode, ensure_pot, allow_upscale, target_max_dim, expected_w, expected_h",
|
||||
[
|
||||
# FIT mode
|
||||
(1000, 800, 500, None, "fit", False, False, None, 500, 400), # Fit width
|
||||
(1000, 800, None, 400, "fit", False, False, None, 500, 400), # Fit height
|
||||
(1000, 800, 500, 500, "fit", False, False, None, 500, 400), # Fit to box (width constrained)
|
||||
(800, 1000, 500, 500, "fit", False, False, None, 400, 500), # Fit to box (height constrained)
|
||||
(100, 80, 200, None, "fit", False, False, None, 100, 80), # Fit width, no upscale
|
||||
(100, 80, 200, None, "fit", False, True, None, 200, 160), # Fit width, allow upscale
|
||||
(100, 80, 128, None, "fit", True, False, None, 128, 64), # Re-evaluated
|
||||
(100, 80, 128, None, "fit", True, True, None, 128, 128), # Fit width, ensure_pot, allow upscale (128, 102 -> pot 128, 128)
|
||||
|
||||
# STRETCH mode
|
||||
(1000, 800, 500, 400, "stretch", False, False, None, 500, 400),
|
||||
(100, 80, 200, 160, "stretch", False, True, None, 200, 160), # Stretch, allow upscale
|
||||
(100, 80, 200, 160, "stretch", False, False, None, 100, 80), # Stretch, no upscale
|
||||
(100, 80, 128, 128, "stretch", True, True, None, 128, 128), # Stretch, ensure_pot, allow upscale
|
||||
(100, 80, 70, 70, "stretch", True, False, None, 64, 64), # Stretch, ensure_pot, no upscale (70,70 -> pot 64,64)
|
||||
|
||||
# MAX_DIM_POT mode
|
||||
        (1000, 800, None, None, "max_dim_pot", True, False, 512, 512, 512),
        (800, 1000, None, None, "max_dim_pot", True, False, 512, 512, 512),
        (1920, 1080, None, None, "max_dim_pot", True, False, 1024, 1024, 512),
        (100, 100, None, None, "max_dim_pot", True, False, 60, 64, 64),
        # Edge cases for calculate_target_dimensions
        (0, 0, 512, 512, "fit", False, False, None, 512, 512),
        (10, 10, 512, 512, "fit", True, False, None, 8, 8),
        (100, 100, 150, 150, "fit", True, False, None, 128, 128),
    ]
)
def test_calculate_target_dimensions(orig_w, orig_h, target_w, target_h, resize_mode, ensure_pot, allow_upscale, target_max_dim, expected_w, expected_h):
    if resize_mode == "max_dim_pot" and target_max_dim is None:
        with pytest.raises(ValueError, match="target_max_dim_for_pot_mode must be provided"):
            ipu.calculate_target_dimensions(orig_w, orig_h, target_width=target_w, target_height=target_h,
                                            resize_mode=resize_mode, ensure_pot=ensure_pot, allow_upscale=allow_upscale,
                                            target_max_dim_for_pot_mode=target_max_dim)
    elif (resize_mode == "fit" and target_w is None and target_h is None) or \
            (resize_mode == "stretch" and (target_w is None or target_h is None)):
        with pytest.raises(ValueError):
            ipu.calculate_target_dimensions(orig_w, orig_h, target_width=target_w, target_height=target_h,
                                            resize_mode=resize_mode, ensure_pot=ensure_pot, allow_upscale=allow_upscale,
                                            target_max_dim_for_pot_mode=target_max_dim)
    else:
        actual_w, actual_h = ipu.calculate_target_dimensions(
            orig_w, orig_h, target_width=target_w, target_height=target_h,
            resize_mode=resize_mode, ensure_pot=ensure_pot, allow_upscale=allow_upscale,
            target_max_dim_for_pot_mode=target_max_dim
        )
        assert (actual_w, actual_h) == (expected_w, expected_h), \
            f"Input: ({orig_w},{orig_h}), T=({target_w},{target_h}), M={resize_mode}, POT={ensure_pot}, UPSC={allow_upscale}, TMAX={target_max_dim}"


def test_calculate_target_dimensions_invalid_mode():
    with pytest.raises(ValueError, match="Unsupported resize_mode"):
        ipu.calculate_target_dimensions(100, 100, 50, 50, resize_mode="invalid_mode")

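The POT fixtures above imply nearest-power-of-two rounding: 576 snaps down to 512, while 60, 100, and 10 snap to 64, 128, and 8. A minimal helper consistent with those expectations might look like this — `nearest_pot` is a hypothetical name for illustration, not a function `ipu` is known to expose:

```python
import math

def nearest_pot(n: float) -> int:
    """Round a positive dimension to the nearest power of two (ties round up)."""
    lower = 2 ** int(math.floor(math.log2(n)))
    upper = lower * 2
    return upper if (n - lower) >= (upper - n) else lower

print(nearest_pot(576))  # 512
print(nearest_pot(60))   # 64
print(nearest_pot(10))   # 8
```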
@pytest.mark.parametrize(
    "ow, oh, rw, rh, expected_str",
    [
        (100, 100, 100, 100, "EVEN"),
        (100, 100, 200, 200, "EVEN"),
        (200, 200, 100, 100, "EVEN"),
        (100, 100, 150, 100, "X15Y1"),
        (100, 100, 50, 100, "X05Y1"),
        (100, 100, 100, 150, "X1Y15"),
        (100, 100, 100, 50, "X1Y05"),
        (100, 50, 150, 75, "EVEN"),
        (100, 50, 150, 50, "X15Y1"),
        (100, 50, 100, 75, "X1Y15"),
        (100, 50, 120, 60, "EVEN"),
        (100, 50, 133, 66, "EVEN"),
        (100, 100, 133, 100, "X133Y1"),
        (100, 100, 100, 133, "X1Y133"),
        (100, 100, 133, 133, "EVEN"),
        (100, 100, 67, 100, "X067Y1"),
        (100, 100, 100, 67, "X1Y067"),
        (100, 100, 67, 67, "EVEN"),
        (1920, 1080, 1024, 576, "EVEN"),
        (1920, 1080, 1024, 512, "X112Y1"),
        (0, 100, 50, 50, "InvalidInput"),
        (100, 0, 50, 50, "InvalidInput"),
        (100, 100, 0, 50, "InvalidResize"),
        (100, 100, 50, 0, "InvalidResize"),
    ]
)
def test_normalize_aspect_ratio_change(ow, oh, rw, rh, expected_str):
    assert ipu.normalize_aspect_ratio_change(ow, oh, rw, rh) == expected_str


# --- Tests for Image Manipulation ---

@mock.patch('cv2.imread')
def test_load_image_success_str_path(mock_cv2_imread):
    mock_img_data = np.array([[[1, 2, 3]]], dtype=np.uint8)
    mock_cv2_imread.return_value = mock_img_data

    result = ipu.load_image("dummy/path.png")

    mock_cv2_imread.assert_called_once_with("dummy/path.png", cv2.IMREAD_UNCHANGED)
    assert np.array_equal(result, mock_img_data)


@mock.patch('cv2.imread')
def test_load_image_success_path_obj(mock_cv2_imread):
    mock_img_data = np.array([[[1, 2, 3]]], dtype=np.uint8)
    mock_cv2_imread.return_value = mock_img_data
    dummy_path = Path("dummy/path.png")

    result = ipu.load_image(dummy_path)

    mock_cv2_imread.assert_called_once_with(str(dummy_path), cv2.IMREAD_UNCHANGED)
    assert np.array_equal(result, mock_img_data)


@mock.patch('cv2.imread')
def test_load_image_failure(mock_cv2_imread):
    mock_cv2_imread.return_value = None

    result = ipu.load_image("dummy/path.png")

    mock_cv2_imread.assert_called_once_with("dummy/path.png", cv2.IMREAD_UNCHANGED)
    assert result is None


@mock.patch('cv2.imread', side_effect=Exception("CV2 Read Error"))
def test_load_image_exception(mock_cv2_imread):
    result = ipu.load_image("dummy/path.png")
    mock_cv2_imread.assert_called_once_with("dummy/path.png", cv2.IMREAD_UNCHANGED)
    assert result is None

@mock.patch('cv2.cvtColor')
def test_convert_bgr_to_rgb_3_channel(mock_cv2_cvtcolor):
    bgr_image = np.random.randint(0, 255, (10, 10, 3), dtype=np.uint8)
    rgb_image_mock = np.random.randint(0, 255, (10, 10, 3), dtype=np.uint8)
    mock_cv2_cvtcolor.return_value = rgb_image_mock

    result = ipu.convert_bgr_to_rgb(bgr_image)

    mock_cv2_cvtcolor.assert_called_once_with(bgr_image, cv2.COLOR_BGR2RGB)
    assert np.array_equal(result, rgb_image_mock)


@mock.patch('cv2.cvtColor')
def test_convert_bgr_to_rgb_4_channel_bgra(mock_cv2_cvtcolor):
    bgra_image = np.random.randint(0, 255, (10, 10, 4), dtype=np.uint8)
    rgb_image_mock = np.random.randint(0, 255, (10, 10, 3), dtype=np.uint8)  # cvtColor BGRA2RGB drops alpha
    mock_cv2_cvtcolor.return_value = rgb_image_mock  # mocking the output of BGRA2RGB

    result = ipu.convert_bgr_to_rgb(bgra_image)

    mock_cv2_cvtcolor.assert_called_once_with(bgra_image, cv2.COLOR_BGRA2RGB)
    assert np.array_equal(result, rgb_image_mock)


def test_convert_bgr_to_rgb_none_input():
    assert ipu.convert_bgr_to_rgb(None) is None


def test_convert_bgr_to_rgb_grayscale_input():
    gray_image = np.random.randint(0, 255, (10, 10), dtype=np.uint8)
    result = ipu.convert_bgr_to_rgb(gray_image)
    assert np.array_equal(result, gray_image)  # should return the image unchanged


@mock.patch('cv2.cvtColor')
def test_convert_rgb_to_bgr_3_channel(mock_cv2_cvtcolor):
    rgb_image = np.random.randint(0, 255, (10, 10, 3), dtype=np.uint8)
    bgr_image_mock = np.random.randint(0, 255, (10, 10, 3), dtype=np.uint8)
    mock_cv2_cvtcolor.return_value = bgr_image_mock

    result = ipu.convert_rgb_to_bgr(rgb_image)

    mock_cv2_cvtcolor.assert_called_once_with(rgb_image, cv2.COLOR_RGB2BGR)
    assert np.array_equal(result, bgr_image_mock)


def test_convert_rgb_to_bgr_none_input():
    assert ipu.convert_rgb_to_bgr(None) is None


def test_convert_rgb_to_bgr_grayscale_input():
    gray_image = np.random.randint(0, 255, (10, 10), dtype=np.uint8)
    result = ipu.convert_rgb_to_bgr(gray_image)
    assert np.array_equal(result, gray_image)  # should return the image unchanged


def test_convert_rgb_to_bgr_4_channel_input():
    rgba_image = np.random.randint(0, 255, (10, 10, 4), dtype=np.uint8)
    result = ipu.convert_rgb_to_bgr(rgba_image)
    assert np.array_equal(result, rgba_image)  # should return the image unchanged

@mock.patch('cv2.resize')
def test_resize_image_downscale(mock_cv2_resize):
    original_image = np.random.randint(0, 255, (100, 100, 3), dtype=np.uint8)
    resized_image_mock = np.random.randint(0, 255, (50, 50, 3), dtype=np.uint8)
    mock_cv2_resize.return_value = resized_image_mock
    target_w, target_h = 50, 50

    result = ipu.resize_image(original_image, target_w, target_h)

    mock_cv2_resize.assert_called_once_with(original_image, (target_w, target_h), interpolation=cv2.INTER_LANCZOS4)
    assert np.array_equal(result, resized_image_mock)


@mock.patch('cv2.resize')
def test_resize_image_upscale(mock_cv2_resize):
    original_image = np.random.randint(0, 255, (50, 50, 3), dtype=np.uint8)
    resized_image_mock = np.random.randint(0, 255, (100, 100, 3), dtype=np.uint8)
    mock_cv2_resize.return_value = resized_image_mock
    target_w, target_h = 100, 100

    result = ipu.resize_image(original_image, target_w, target_h)

    mock_cv2_resize.assert_called_once_with(original_image, (target_w, target_h), interpolation=cv2.INTER_CUBIC)
    assert np.array_equal(result, resized_image_mock)


@mock.patch('cv2.resize')
def test_resize_image_custom_interpolation(mock_cv2_resize):
    original_image = np.random.randint(0, 255, (100, 100, 3), dtype=np.uint8)
    resized_image_mock = np.random.randint(0, 255, (50, 50, 3), dtype=np.uint8)
    mock_cv2_resize.return_value = resized_image_mock
    target_w, target_h = 50, 50

    result = ipu.resize_image(original_image, target_w, target_h, interpolation=cv2.INTER_NEAREST)

    mock_cv2_resize.assert_called_once_with(original_image, (target_w, target_h), interpolation=cv2.INTER_NEAREST)
    assert np.array_equal(result, resized_image_mock)


def test_resize_image_none_input():
    with pytest.raises(ValueError, match="Cannot resize a None image."):
        ipu.resize_image(None, 50, 50)


@pytest.mark.parametrize("w, h", [(0, 50), (50, 0), (-1, 50)])
def test_resize_image_invalid_dims(w, h):
    original_image = np.random.randint(0, 255, (100, 100, 3), dtype=np.uint8)
    with pytest.raises(ValueError, match="Target width and height must be positive."):
        ipu.resize_image(original_image, w, h)
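The mocked tests above pin down the expected default interpolation: Lanczos when shrinking, cubic when enlarging. That selection logic can be sketched independently of OpenCV — `pick_interpolation` is a hypothetical helper for illustration; `ipu.resize_image` may implement the choice differently:

```python
def pick_interpolation(orig_w: int, orig_h: int, target_w: int, target_h: int) -> str:
    """Return the cv2 interpolation flag name matching the test expectations."""
    downscaling = target_w * target_h < orig_w * orig_h
    return "INTER_LANCZOS4" if downscaling else "INTER_CUBIC"

print(pick_interpolation(100, 100, 50, 50))  # INTER_LANCZOS4
print(pick_interpolation(50, 50, 100, 100))  # INTER_CUBIC
```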
@mock.patch('cv2.imwrite')
@mock.patch('pathlib.Path.mkdir')  # mock mkdir to avoid actual directory creation
def test_save_image_success(mock_mkdir, mock_cv2_imwrite):
    mock_cv2_imwrite.return_value = True
    img_data = np.zeros((10, 10, 3), dtype=np.uint8)  # RGB
    save_path = "output/test.png"

    # ipu.save_image converts RGB to BGR before writing non-EXR formats, so
    # cv2.imwrite should receive BGR data. Rather than mocking the internal
    # convert_rgb_to_bgr call, rely on its own unit tests and verify the data
    # actually passed to imwrite.
    success = ipu.save_image(save_path, img_data, convert_to_bgr_before_save=True)

    assert success is True
    mock_mkdir.assert_called_once_with(parents=True, exist_ok=True)

    # imwrite receives (path, image); the image must be the BGR conversion.
    expected_bgr_data = cv2.cvtColor(img_data, cv2.COLOR_RGB2BGR)

    args, kwargs = mock_cv2_imwrite.call_args
    assert args[0] == str(Path(save_path))
    assert np.array_equal(args[1], expected_bgr_data)


@mock.patch('cv2.imwrite')
@mock.patch('pathlib.Path.mkdir')
def test_save_image_success_exr_no_bgr_conversion(mock_mkdir, mock_cv2_imwrite):
    mock_cv2_imwrite.return_value = True
    img_data_rgb_float = np.random.rand(10, 10, 3).astype(np.float32)  # RGB float for EXR
    save_path = "output/test.exr"

    success = ipu.save_image(save_path, img_data_rgb_float, output_format="exr", convert_to_bgr_before_save=False)

    assert success is True
    mock_mkdir.assert_called_once_with(parents=True, exist_ok=True)
    args, kwargs = mock_cv2_imwrite.call_args
    assert args[0] == str(Path(save_path))
    assert np.array_equal(args[1], img_data_rgb_float)  # should be the original RGB data


@mock.patch('cv2.imwrite')
@mock.patch('pathlib.Path.mkdir')
def test_save_image_success_explicit_bgr_false_png(mock_mkdir, mock_cv2_imwrite):
    mock_cv2_imwrite.return_value = True
    img_data_rgb = np.zeros((10, 10, 3), dtype=np.uint8)  # RGB
    save_path = "output/test.png"

    # With convert_to_bgr_before_save=False the data is passed through as-is,
    # even though OpenCV's imwrite conventionally expects BGR for PNG. Per the
    # docstring: "If True and image is 3-channel, converts RGB to BGR."
    success = ipu.save_image(save_path, img_data_rgb, convert_to_bgr_before_save=False)

    assert success is True
    mock_mkdir.assert_called_once_with(parents=True, exist_ok=True)
    args, kwargs = mock_cv2_imwrite.call_args
    assert args[0] == str(Path(save_path))
    assert np.array_equal(args[1], img_data_rgb)


@mock.patch('cv2.imwrite')
@mock.patch('pathlib.Path.mkdir')
def test_save_image_failure(mock_mkdir, mock_cv2_imwrite):
    mock_cv2_imwrite.return_value = False
    img_data = np.zeros((10, 10, 3), dtype=np.uint8)
    save_path = "output/fail.png"

    success = ipu.save_image(save_path, img_data)

    assert success is False
    mock_mkdir.assert_called_once_with(parents=True, exist_ok=True)
    mock_cv2_imwrite.assert_called_once()


def test_save_image_none_data():
    assert ipu.save_image("output/none.png", None) is False


@mock.patch('cv2.imwrite', side_effect=Exception("CV2 Write Error"))
@mock.patch('pathlib.Path.mkdir')
def test_save_image_exception(mock_mkdir, mock_cv2_imwrite_exception):
    img_data = np.zeros((10, 10, 3), dtype=np.uint8)
    save_path = "output/exception.png"

    success = ipu.save_image(save_path, img_data)

    assert success is False
    mock_mkdir.assert_called_once_with(parents=True, exist_ok=True)
    mock_cv2_imwrite_exception.assert_called_once()


# Test data type conversions in save_image
@pytest.mark.parametrize(
    "input_dtype, input_data_producer, output_dtype_target, expected_conversion_dtype, check_scaling",
    [
        (np.uint16, lambda: np.random.randint(0, 65535, (10, 10, 3), dtype=np.uint16), np.uint8, np.uint8, True),
        (np.float32, lambda: np.random.rand(10, 10, 3).astype(np.float32), np.uint8, np.uint8, True),
        (np.uint8, lambda: np.random.randint(0, 255, (10, 10, 3), dtype=np.uint8), np.uint16, np.uint16, True),
        (np.float32, lambda: np.random.rand(10, 10, 3).astype(np.float32), np.uint16, np.uint16, True),
        (np.uint8, lambda: np.random.randint(0, 255, (10, 10, 3), dtype=np.uint8), np.float16, np.float16, True),
        (np.uint16, lambda: np.random.randint(0, 65535, (10, 10, 3), dtype=np.uint16), np.float32, np.float32, True),
    ]
)
@mock.patch('cv2.imwrite')
@mock.patch('pathlib.Path.mkdir')
def test_save_image_dtype_conversion(mock_mkdir, mock_cv2_imwrite, input_dtype, input_data_producer, output_dtype_target, expected_conversion_dtype, check_scaling):
    mock_cv2_imwrite.return_value = True
    img_data = input_data_producer()
    original_img_data_copy = img_data.copy()  # for checking scaling if needed

    ipu.save_image("output/dtype_test.png", img_data, output_dtype_target=output_dtype_target)

    mock_cv2_imwrite.assert_called_once()
    saved_img_data = mock_cv2_imwrite.call_args[0][1]  # image data passed to imwrite

    assert saved_img_data.dtype == expected_conversion_dtype

    if check_scaling:
        # Basic sanity checks; precise checks would require known input/output values.
        if output_dtype_target == np.uint8:
            if input_dtype == np.uint16:
                expected_scaled_data = (original_img_data_copy.astype(np.float32) / 65535.0 * 255.0).astype(np.uint8)
                assert np.allclose(saved_img_data, cv2.cvtColor(expected_scaled_data, cv2.COLOR_RGB2BGR), atol=1)  # small diff allowed for float precision
            elif input_dtype in [np.float16, np.float32, np.float64]:
                expected_scaled_data = (np.clip(original_img_data_copy, 0.0, 1.0) * 255.0).astype(np.uint8)
                assert np.allclose(saved_img_data, cv2.cvtColor(expected_scaled_data, cv2.COLOR_RGB2BGR), atol=1)
        elif output_dtype_target == np.uint16:
            if input_dtype == np.uint8:
                expected_scaled_data = (original_img_data_copy.astype(np.float32) / 255.0 * 65535.0).astype(np.uint16)
                assert np.allclose(saved_img_data, cv2.cvtColor(expected_scaled_data, cv2.COLOR_RGB2BGR), atol=1)
            elif input_dtype in [np.float16, np.float32, np.float64]:
                expected_scaled_data = (np.clip(original_img_data_copy, 0.0, 1.0) * 65535.0).astype(np.uint16)
                assert np.allclose(saved_img_data, cv2.cvtColor(expected_scaled_data, cv2.COLOR_RGB2BGR), atol=1)
        # Add more scaling checks for float16, float32 if necessary
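The scaling expressions asserted above can be checked in isolation; routing integer conversions through float avoids truncation surprises, and clipping floats first prevents out-of-range values from wrapping. A small NumPy sketch:

```python
import numpy as np

# uint16 -> uint8: normalize by the source max, rescale, then truncate
u16 = np.array([0, 32768, 65535], dtype=np.uint16)
u8 = (u16.astype(np.float32) / 65535.0 * 255.0).astype(np.uint8)
print(u8.tolist())  # [0, 127, 255]

# float -> uint8: clip to [0, 1] first so out-of-range values cannot wrap
f32 = np.array([-0.5, 0.5, 1.5], dtype=np.float32)
u8f = (np.clip(f32, 0.0, 1.0) * 255.0).astype(np.uint8)
print(u8f.tolist())  # [0, 127, 255]
```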
# --- Tests for calculate_image_stats ---

def test_calculate_image_stats_grayscale_uint8():
    img_data = np.array([[0, 128], [255, 10]], dtype=np.uint8)
    # Expected normalized values: approx [[0, 0.50196], [1.0, 0.03921]]
    stats = ipu.calculate_image_stats(img_data)
    assert stats is not None
    assert np.isclose(stats["min"], 0 / 255.0)
    assert np.isclose(stats["max"], 255 / 255.0)
    assert np.isclose(stats["mean"], np.mean(img_data.astype(np.float64) / 255.0))


def test_calculate_image_stats_color_uint8():
    img_data = np.array([
        [[0, 50, 100], [10, 60, 110]],
        [[255, 128, 200], [20, 70, 120]]
    ], dtype=np.uint8)
    stats = ipu.calculate_image_stats(img_data)
    assert stats is not None
    # Min per channel (normalized)
    assert np.allclose(stats["min"], [0 / 255.0, 50 / 255.0, 100 / 255.0])
    # Max per channel (normalized)
    assert np.allclose(stats["max"], [255 / 255.0, 128 / 255.0, 200 / 255.0])
    # Mean per channel (normalized)
    expected_mean = np.mean(img_data.astype(np.float64) / 255.0, axis=(0, 1))
    assert np.allclose(stats["mean"], expected_mean)


def test_calculate_image_stats_grayscale_uint16():
    img_data = np.array([[0, 32768], [65535, 1000]], dtype=np.uint16)
    stats = ipu.calculate_image_stats(img_data)
    assert stats is not None
    assert np.isclose(stats["min"], 0 / 65535.0)
    assert np.isclose(stats["max"], 65535 / 65535.0)
    assert np.isclose(stats["mean"], np.mean(img_data.astype(np.float64) / 65535.0))


def test_calculate_image_stats_color_float32():
    # Floats are assumed to already be in the 0-1 range by the function's normalization logic.
    img_data = np.array([
        [[0.0, 0.2, 0.4], [0.1, 0.3, 0.5]],
        [[1.0, 0.5, 0.8], [0.05, 0.25, 0.6]]
    ], dtype=np.float32)
    stats = ipu.calculate_image_stats(img_data)
    assert stats is not None
    assert np.allclose(stats["min"], [0.0, 0.2, 0.4])
    assert np.allclose(stats["max"], [1.0, 0.5, 0.8])
    expected_mean = np.mean(img_data.astype(np.float64), axis=(0, 1))
    assert np.allclose(stats["mean"], expected_mean)


def test_calculate_image_stats_none_input():
    assert ipu.calculate_image_stats(None) is None


def test_calculate_image_stats_unsupported_shape():
    img_data = np.zeros((2, 2, 2, 2), dtype=np.uint8)  # 4D array
    assert ipu.calculate_image_stats(img_data) is None


@mock.patch('numpy.mean', side_effect=Exception("Numpy error"))
def test_calculate_image_stats_exception_during_calculation(mock_np_mean):
    img_data = np.array([[0, 128], [255, 10]], dtype=np.uint8)
    stats = ipu.calculate_image_stats(img_data)
    assert stats == {"error": "Error calculating image stats"}
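The normalization convention these tests assume — integer images divided by their dtype's maximum, floats taken as already in [0, 1] — can be exercised directly:

```python
import numpy as np

img = np.array([[0, 128], [255, 10]], dtype=np.uint8)
norm = img.astype(np.float64) / np.iinfo(img.dtype).max  # uint8 -> [0, 1]
print(norm.min())   # 0.0
print(norm.max())   # 1.0
print(norm.mean())  # 0.38529... (== (0 + 128 + 255 + 10) / 4 / 255)
```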
# Example of mocking ipu.load_image for a function that uses it internally.
# The current calculate_image_stats takes image_data directly, so this is not
# needed for it; the test below is a conceptual template for a hypothetical
# wrapper such as:
#   def process_image_from_path(path):
#       img_data = ipu.load_image(path)
#       return ipu.calculate_image_stats(img_data)
@mock.patch('processing.utils.image_processing_utils.load_image')
def test_hypothetical_function_using_load_image(mock_load_image):
    mock_img_data = np.array([[[0.5]]], dtype=np.float32)
    mock_load_image.return_value = mock_img_data

    # result = ipu.hypothetical_process_image_from_path("dummy.png")
    # mock_load_image.assert_called_once_with("dummy.png")
    # assert result["mean"] == 0.5
    pass  # conceptual example only
1
tests/utils/__init__.py
Normal file
@ -0,0 +1 @@
# This file makes the 'tests/utils' directory a Python package.
252
tests/utils/test_path_utils.py
Normal file
@ -0,0 +1,252 @@
import pytest
from pathlib import Path
from utils.path_utils import sanitize_filename, generate_path_from_pattern


# Tests for sanitize_filename
def test_sanitize_filename_valid():
    assert sanitize_filename("valid_filename.txt") == "valid_filename.txt"


def test_sanitize_filename_with_spaces():
    assert sanitize_filename("file name with spaces.txt") == "file_name_with_spaces.txt"


def test_sanitize_filename_with_special_characters():
    # Runs of invalid characters collapse to a single underscore; dots survive.
    assert sanitize_filename("file!@#$%^&*()[]{};:'\",.<>/?\\|.txt") == "file_._.txt"


def test_sanitize_filename_with_leading_trailing_whitespace():
    # Whitespace becomes underscores; leading/trailing underscores are stripped.
    assert sanitize_filename(" filename_with_spaces .txt") == "filename_with_spaces_.txt"


def test_sanitize_filename_empty_string():
    assert sanitize_filename("") == "invalid_name"


def test_sanitize_filename_with_none():
    # Non-string input is coerced with str() rather than raising.
    assert sanitize_filename(None) == "None"


def test_sanitize_filename_mixed_case():
    assert sanitize_filename("MixedCaseFileName.PNG") == "MixedCaseFileName.PNG"


def test_sanitize_filename_long_filename():
    # The function sanitizes but does not truncate; OS filename-length limits
    # are outside its scope.
    long_name = "a" * 255 + ".txt"
    assert sanitize_filename(long_name) == long_name


def test_sanitize_filename_unicode_characters():
    # \w matches Unicode word characters in Python 3, so these pass through.
    assert sanitize_filename("文件名前缀_文件名_后缀.jpg") == "文件名前缀_文件名_后缀.jpg"


def test_sanitize_filename_multiple_extensions():
    assert sanitize_filename("archive.tar.gz") == "archive.tar.gz"


def test_sanitize_filename_no_extension():
    assert sanitize_filename("filename") == "filename"


def test_sanitize_filename_only_special_chars():
    # Everything collapses to underscores, which are then stripped, leaving the fallback.
    assert sanitize_filename("!@#$%^") == "invalid_name"


def test_sanitize_filename_with_hyphens_and_underscores():
    assert sanitize_filename("file-name_with-hyphens_and_underscores.zip") == "file-name_with-hyphens_and_underscores.zip"

# Tests for generate_path_from_pattern
def test_generate_path_basic():
    result = generate_path_from_pattern(
        base_path="output",
        pattern="{asset_name}/{map_type}/{filename}",
        asset_name="MyAsset",
        map_type="Diffuse",
        filename="MyAsset_Diffuse.png",
        source_rule_name="TestRule",
        incrementing_value=None,
        sha5_value=None
    )
    expected = Path("output/MyAsset/Diffuse/MyAsset_Diffuse.png")
    assert Path(result) == expected


def test_generate_path_all_placeholders():
    result = generate_path_from_pattern(
        base_path="project_files",
        pattern="{source_rule_name}/{asset_name}/{map_type}_{incrementing_value}_{sha5_value}/{filename}",
        asset_name="AnotherAsset",
        map_type="Normal",
        filename="NormalMap.tif",
        source_rule_name="ComplexRule",
        incrementing_value="001",
        sha5_value="abcde"
    )
    expected = Path("project_files/ComplexRule/AnotherAsset/Normal_001_abcde/NormalMap.tif")
    assert Path(result) == expected


def test_generate_path_optional_placeholders_none():
    result = generate_path_from_pattern(
        base_path="data",
        pattern="{asset_name}/{filename}",
        asset_name="SimpleAsset",
        map_type="Albedo",  # provided but unused since the pattern omits it
        filename="texture.jpg",
        source_rule_name="Basic",
        incrementing_value=None,
        sha5_value=None
    )
    expected = Path("data/SimpleAsset/texture.jpg")
    assert Path(result) == expected

def test_generate_path_optional_incrementing_value_present():
    result = generate_path_from_pattern(
        base_path="assets",
        pattern="{asset_name}/{map_type}/v{incrementing_value}/{filename}",
        asset_name="VersionedAsset",
        map_type="Specular",
        filename="spec.png",
        source_rule_name="VersioningRule",
        incrementing_value="3",
        sha5_value=None
    )
    expected = Path("assets/VersionedAsset/Specular/v3/spec.png")
    assert Path(result) == expected


def test_generate_path_optional_sha5_value_present():
    result = generate_path_from_pattern(
        base_path="cache",
        pattern="{asset_name}/{sha5_value}/{filename}",
        asset_name="HashedAsset",
        map_type="Roughness",
        filename="rough.exr",
        source_rule_name="HashingRule",
        incrementing_value=None,
        sha5_value="f1234"
    )
    expected = Path("cache/HashedAsset/f1234/rough.exr")
    assert Path(result) == expected


def test_generate_path_base_path_is_path_object():
    result = generate_path_from_pattern(
        base_path=Path("output_path"),
        pattern="{asset_name}/{filename}",
        asset_name="ObjectAsset",
        map_type="AO",
        filename="ao.png",
        source_rule_name="PathObjectRule",
        incrementing_value=None,
        sha5_value=None
    )
    expected = Path("output_path/ObjectAsset/ao.png")
    assert Path(result) == expected

def test_generate_path_empty_pattern():
    result = generate_path_from_pattern(
        base_path="output",
        pattern="",  # an empty pattern should just join base_path and filename
        asset_name="MyAsset",
        map_type="Diffuse",
        filename="MyAsset_Diffuse.png",
        source_rule_name="TestRule",
        incrementing_value=None,
        sha5_value=None
    )
    expected = Path("output/MyAsset_Diffuse.png")
    assert Path(result) == expected


def test_generate_path_pattern_with_no_placeholders():
    result = generate_path_from_pattern(
        base_path="fixed_output",
        pattern="some/static/path",  # a pattern with no placeholders is used literally
        asset_name="MyAsset",
        map_type="Diffuse",
        filename="MyAsset_Diffuse.png",
        source_rule_name="TestRule",
        incrementing_value=None,
        sha5_value=None
    )
    expected = Path("fixed_output/some/static/path/MyAsset_Diffuse.png")
    assert Path(result) == expected


def test_generate_path_filename_with_subdirs_in_pattern():
    result = generate_path_from_pattern(
        base_path="output",
        pattern="{asset_name}",  # the filename itself is appended to the pattern result
        asset_name="AssetWithSubdirFile",
        map_type="Color",
        filename="textures/variant1/color.png",  # filename contains subdirectories
        source_rule_name="SubdirRule",
        incrementing_value=None,
        sha5_value=None
    )
    expected = Path("output/AssetWithSubdirFile/textures/variant1/color.png")
    assert Path(result) == expected

def test_generate_path_no_filename_provided():
    # Behavior for a missing filename depends on the implementation; this test
    # assumes a None filename raises (TypeError, or possibly ValueError).
    # Adjust if the function instead falls back to a default name.
    with pytest.raises(TypeError):
        generate_path_from_pattern(
            base_path="output",
            pattern="{asset_name}/{map_type}",
            asset_name="MyAsset",
            map_type="Diffuse",
            filename=None,  # no filename
            source_rule_name="TestRule",
            incrementing_value=None,
            sha5_value=None
        )

def test_generate_path_all_values_are_empty_strings_or_none_where_applicable():
    result = generate_path_from_pattern(
        base_path="",  # empty base_path means relative to the current directory
        pattern="{asset_name}/{map_type}/{incrementing_value}/{sha5_value}",
        asset_name="",
        map_type="",
        filename="empty_test.file",
        source_rule_name="",
        incrementing_value="",
        sha5_value=""
    )
    # With every segment empty, the pattern expands to "///". Both os.path.join
    # and pathlib normalize empty segments and repeated separators away, so the
    # result should reduce to just the filename.
    assert Path(result) == Path("empty_test.file")
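The joining behavior this test leans on is easy to confirm in isolation: `os.path.join` lets empty leading components contribute nothing, and `pathlib` collapses repeated separators inside a relative path.

```python
import os
from pathlib import Path

# Empty leading segments contribute nothing to the joined path
print(os.path.join("", "", "", "empty_test.file"))  # empty_test.file

# pathlib normalizes doubled separators in a relative path
print(Path("a//b") == Path("a/b"))  # True
```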
def test_generate_path_with_dots_in_placeholders():
    result = generate_path_from_pattern(
        base_path="output",
        pattern="{asset_name}/{map_type}",
        asset_name="My.Asset.V1",
        map_type="Diffuse.Main",
        filename="texture.png",
        source_rule_name="DotsRule",
        incrementing_value=None,
        sha5_value=None
    )
    expected = Path("output/My.Asset.V1/Diffuse.Main/texture.png")
    assert Path(result) == expected
@ -154,6 +154,15 @@ def get_next_incrementing_value(output_base_path: Path, output_directory_pattern
    logger.info(f"Determined next incrementing value: {next_value_str} (Max found: {max_value})")
    return next_value_str


def sanitize_filename(name: str) -> str:
    """Removes or replaces characters invalid for filenames/directory names."""
    if not isinstance(name, str):
        name = str(name)
    name = re.sub(r'[^\w.\-]+', '_', name)  # allow alphanumerics, underscore, hyphen, dot
    name = re.sub(r'_+', '_', name)         # collapse repeated underscores
    name = name.strip('_')
    if not name:
        name = "invalid_name"
    return name


# --- Basic Unit Tests ---
if __name__ == "__main__":
    print("Running basic tests for path_utils.generate_path_from_pattern...")
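For quick experimentation, the sanitizer above can be run standalone — this is a verbatim copy of the function shown in the diff, plus the `re` import it needs:

```python
import re

def sanitize_filename(name: str) -> str:
    """Removes or replaces characters invalid for filenames/directory names."""
    if not isinstance(name, str):
        name = str(name)
    name = re.sub(r'[^\w.\-]+', '_', name)  # invalid runs -> single underscore
    name = re.sub(r'_+', '_', name)         # collapse repeated underscores
    name = name.strip('_')                  # no leading/trailing underscores
    if not name:
        name = "invalid_name"
    return name

print(sanitize_filename("file name with spaces.txt"))  # file_name_with_spaces.txt
print(sanitize_filename("!@#$%^"))                     # invalid_name
print(sanitize_filename("archive.tar.gz"))             # archive.tar.gz
```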