Cleanup of inconsistencies

Rusfort 2025-05-13 03:07:00 +02:00
parent 0de4db1826
commit 35a7221f57
5 changed files with 96 additions and 187 deletions

View File

@@ -1,4 +1,4 @@
-# Developer Guide: Processing Pipeline Cl
+# Developer Guide: Processing Pipeline
This document details the step-by-step technical process executed by the asset processing pipeline, which is initiated by the [`ProcessingEngine`](processing_engine.py:73) class (`processing_engine.py`) and orchestrated by the [`PipelineOrchestrator`](processing/pipeline/orchestrator.py:36) (`processing/pipeline/orchestrator.py`).
@@ -57,12 +57,12 @@ The [`PipelineOrchestrator`](processing/pipeline/orchestrator.py:36) iterates th
    * **Context Interaction**: Populates `context.processing_items` and initializes `context.intermediate_results`.
2. **[`RegularMapProcessorStage`](processing/pipeline/stages/regular_map_processor.py:18)** (`processing/pipeline/stages/regular_map_processor.py`):
-    * **Responsibility**: (Executed per [`FileRule`](rule_structure.py:5) item) Loads the image data for a single file, determines its potentially suffixed internal map type, applies in-memory transformations (Gloss-to-Rough, Normal Green Invert), and returns the processed image data and details in a [`ProcessedRegularMapData`](processing/pipeline/asset_context.py:23) object.
+    * **Responsibility**: (Executed per [`FileRule`](rule_structure.py:5) item) Checks whether `FileRule.item_type` starts with "MAP_"; if not, the item is skipped. Otherwise, it loads the image data for the file, determines its potentially suffixed internal map type (e.g., "MAP_COL-1"), applies in-memory transformations (Gloss-to-Rough, Normal Green Invert) using the shared utility function [`apply_common_map_transformations`](processing/utils/image_processing_utils.py), and returns the processed image data and details in a [`ProcessedRegularMapData`](processing/pipeline/asset_context.py:23) object. The `internal_map_type` in the output reflects any transformations (e.g., "MAP_GLOSS" becomes "MAP_ROUGH").
-    * **Context Interaction**: Reads from the input [`FileRule`](rule_structure.py:5) and [`Configuration`](configuration.py:68). Returns a [`ProcessedRegularMapData`](processing/pipeline/asset_context.py:23) object which is stored in `context.intermediate_results`.
+    * **Context Interaction**: Reads from the input [`FileRule`](rule_structure.py:5) (checking `item_type`) and [`Configuration`](configuration.py:68). Returns a [`ProcessedRegularMapData`](processing/pipeline/asset_context.py:23) object which is stored in `context.intermediate_results`.
3. **[`MergedTaskProcessorStage`](processing/pipeline/stages/merged_task_processor.py:68)** (`processing/pipeline/stages/merged_task_processor.py`):
-    * **Responsibility**: (Executed per [`MergeTaskDefinition`](processing/pipeline/asset_context.py:16) item) Loads and prepares multiple input images based on the merge task definition (including fallbacks and in-memory transformations), handles dimension mismatches, performs the channel merging operation, and returns the merged image data and details in a [`ProcessedMergedMapData`](processing/pipeline/asset_context.py:35) object.
+    * **Responsibility**: (Executed per [`MergeTaskDefinition`](processing/pipeline/asset_context.py:16) item) Validates that all input map types specified in the merge rule start with "MAP_"; if not, the task fails. Otherwise, it loads and prepares multiple input images based on the merge task definition (including fallbacks and in-memory transformations applied to inputs via [`apply_common_map_transformations`](processing/utils/image_processing_utils.py)), handles dimension mismatches, performs the channel merging operation, and returns the merged image data and details in a [`ProcessedMergedMapData`](processing/pipeline/asset_context.py:35) object. The `output_map_type` of the merged map must also be "MAP_" prefixed in the configuration.
-    * **Context Interaction**: Reads from the input [`MergeTaskDefinition`](processing/pipeline/asset_context.py:16), `context.workspace_path`, and [`Configuration`](configuration.py:68). Returns a [`ProcessedMergedMapData`](processing/pipeline/asset_context.py:35) object which is stored in `context.intermediate_results`.
+    * **Context Interaction**: Reads from the input [`MergeTaskDefinition`](processing/pipeline/asset_context.py:16) (checking input map types), `context.workspace_path`, and [`Configuration`](configuration.py:68). Returns a [`ProcessedMergedMapData`](processing/pipeline/asset_context.py:35) object which is stored in `context.intermediate_results`.
4. **[`InitialScalingStage`](processing/pipeline/stages/initial_scaling.py:14)** (`processing/pipeline/stages/initial_scaling.py`):
    * **Responsibility**: (Executed per item) Applies initial scaling (e.g., Power-of-Two downscaling) to the image data from the previous processing stage based on the `initial_scaling_mode` configuration.
@@ -70,7 +70,7 @@ The [`PipelineOrchestrator`](processing/pipeline/orchestrator.py:36) iterates th
5. **[`SaveVariantsStage`](processing/pipeline/stages/save_variants.py:15)** (`processing/pipeline/stages/save_variants.py`):
    * **Responsibility**: (Executed per item) Takes the final processed image data (potentially scaled) and configuration, and calls a utility to save the image to temporary files in various resolutions and formats as defined by the configuration.
-    * **Context Interaction**: Takes a [`SaveVariantsInput`](processing/pipeline/asset_context.py:61) object. Returns a [`SaveVariantsOutput`](processing/pipeline/asset_context.py:79) object containing details about the saved temporary files. The orchestrator stores these details in `context.processed_maps_details` for the item.
+    * **Context Interaction**: Takes a [`SaveVariantsInput`](processing/pipeline/asset_context.py:61) object (which includes the "MAP_" prefixed `internal_map_type`). It uses the `get_filename_friendly_map_type` utility to convert this to a "standard type" (e.g., "COL") for output naming. Returns a [`SaveVariantsOutput`](processing/pipeline/asset_context.py:79) object containing details about the saved temporary files. The orchestrator stores these details, including the original "MAP_" prefixed `internal_map_type`, in `context.processed_maps_details` for the item.
### Post-Item Stages
@@ -78,7 +78,7 @@ These stages are executed sequentially once for each asset after the core item p
1. **[`OutputOrganizationStage`](processing/pipeline/stages/output_organization.py:14)** (`processing/pipeline/stages/output_organization.py`):
    * **Responsibility**: Determines the final output paths for all processed maps (including variants) and extra files based on configured patterns. It copies the temporary files generated by the core stages to these final destinations, creating directories as needed and respecting overwrite settings.
-    * **Context Interaction**: Reads from `context.processed_maps_details`, `context.files_to_process` (for 'EXTRA' files), `context.output_base_path`, and [`Configuration`](configuration.py:68). Updates entries in `context.processed_maps_details` with final paths and organization status. Populates `context.asset_metadata['final_output_files']`.
+    * **Context Interaction**: Reads from `context.processed_maps_details` (using the "MAP_" prefixed `internal_map_type` to get the "standard type" via `get_filename_friendly_map_type` for output naming), `context.files_to_process` (for 'EXTRA' files), `context.output_base_path`, and [`Configuration`](configuration.py:68). Updates entries in `context.processed_maps_details` with final paths and organization status. Populates `context.asset_metadata['final_output_files']`. (Note: Legacy code for the `'Processed_With_Variants'` status has been removed from this stage.)
2. **[`MetadataFinalizationAndSaveStage`](processing/pipeline/stages/metadata_finalization_save.py:14)** (`processing/pipeline/stages/metadata_finalization_save.py`):
    * **Responsibility**: Finalizes the `context.asset_metadata` (setting end time, final status based on flags). It restructures the processed map details for inclusion, determines the save path for the metadata file based on configuration and patterns, serializes the metadata to JSON, and saves the `metadata.json` file to the final output location.
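The naming conventions the guide above relies on (the "MAP_" prefix check, numeric suffixes like "-1", and the Gloss-to-Rough rename) can be sketched in a few lines. The helper names below are illustrative stand-ins, not the pipeline's actual API; the last one merely mirrors the described behavior of `get_filename_friendly_map_type`:

```python
import re

# Hypothetical stand-ins for the map-type conventions described above.
def is_map_item(item_type: str) -> bool:
    # Items whose type lacks the "MAP_" prefix are skipped by the processor stages.
    return item_type.startswith("MAP_")

def rename_gloss_to_rough(internal_map_type: str) -> str:
    # Any numeric suffix is preserved: "MAP_GLOSS-1" -> "MAP_ROUGH-1".
    return internal_map_type.replace("GLOSS", "ROUGH")

def filename_friendly(internal_map_type: str) -> str:
    # Strip the "MAP_" prefix and any numeric suffix for output naming,
    # mirroring what get_filename_friendly_map_type is described as doing:
    # "MAP_COL-1" -> "COL".
    return re.sub(r"-\d+$", "", internal_map_type[len("MAP_"):])

print(is_map_item("EXTRA"))                  # False
print(rename_gloss_to_rough("MAP_GLOSS-1"))  # MAP_ROUGH-1
print(filename_friendly("MAP_COL-1"))        # COL
```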

View File

@@ -14,55 +14,6 @@ from ...utils import image_processing_utils as ipu
log = logging.getLogger(__name__)
# Helper function (Duplicated from RegularMapProcessorStage - consider moving to utils)
def _apply_in_memory_transformations(
    image_data: np.ndarray,
    processing_map_type: str, # The internal type of the *input* map
    invert_normal_green: bool,
    file_type_definitions: Dict[str, Dict],
    log_prefix: str
) -> Tuple[np.ndarray, str, List[str]]:
    """
    Applies in-memory transformations (Gloss-to-Rough, Normal Green Invert).
    Returns potentially transformed image data, potentially updated map type, and notes.
    NOTE: This is applied to individual inputs *before* merging.
    """
    transformation_notes = []
    current_image_data = image_data # Start with original data
    updated_processing_map_type = processing_map_type # Start with original type
    # Gloss-to-Rough
    base_map_type_match = re.match(r"(MAP_GLOSS)", processing_map_type)
    if base_map_type_match:
        log.info(f"{log_prefix}: Applying Gloss-to-Rough conversion to input.")
        inversion_succeeded = False
        if np.issubdtype(current_image_data.dtype, np.floating):
            current_image_data = 1.0 - current_image_data
            current_image_data = np.clip(current_image_data, 0.0, 1.0)
            log.debug(f"{log_prefix}: Inverted float input data for Gloss->Rough.")
            inversion_succeeded = True
        elif np.issubdtype(current_image_data.dtype, np.integer):
            max_val = np.iinfo(current_image_data.dtype).max
            current_image_data = max_val - current_image_data
            log.debug(f"{log_prefix}: Inverted integer input data (max_val: {max_val}) for Gloss->Rough.")
            inversion_succeeded = True
        else:
            log.error(f"{log_prefix}: Unsupported image data type {current_image_data.dtype} for GLOSS input map. Cannot invert.")
            transformation_notes.append("Gloss-to-Rough FAILED (unsupported dtype)")
        if inversion_succeeded:
            updated_processing_map_type = processing_map_type.replace("GLOSS", "ROUGH")
            log.info(f"{log_prefix}: Input map type conceptually updated: '{processing_map_type}' -> '{updated_processing_map_type}'")
            transformation_notes.append("Gloss-to-Rough applied to input")
    # Normal Green Invert
    base_map_type_match_nrm = re.match(r"(MAP_NRM)", processing_map_type)
    if base_map_type_match_nrm and invert_normal_green:
        log.info(f"{log_prefix}: Applying Normal Map Green Channel Inversion (Global Setting) to input.")
        current_image_data = ipu.invert_normal_map_green_channel(current_image_data)
        transformation_notes.append("Normal Green Inverted (Global) applied to input")
    # Return the transformed data, the *original* map type (as it identifies the input source), and notes
    return current_image_data, processing_map_type, transformation_notes
class MergedTaskProcessorStage(ProcessingStage):
@@ -193,7 +144,7 @@ class MergedTaskProcessorStage(ProcessingStage):
            # 3. Apply Transformations to the loaded/fallback input
            if input_image_data is not None:
-                input_image_data, _, transform_notes = _apply_in_memory_transformations(
+                input_image_data, _, transform_notes = ipu.apply_common_map_transformations(
                    input_image_data.copy(), # Transform a copy
                    required_map_type_from_rule, # Use the type required by the rule
                    invert_normal_green,
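The replacement call above delegates to the shared helper whose full body lands in `processing/utils/image_processing_utils.py` later in this commit. As a minimal, self-contained sketch of its `(data, map_type, notes)` contract (only the integer Gloss-to-Rough path is modeled here; this stand-in is not the real implementation):

```python
import numpy as np

# Stand-in mirroring the (data, map_type, notes) return contract of
# apply_common_map_transformations; the real code lives in
# processing/utils/image_processing_utils.py.
def apply_common_map_transformations(image_data, processing_map_type,
                                     invert_normal_green,
                                     file_type_definitions, log_prefix):
    notes = []
    if processing_map_type.startswith("MAP_GLOSS") and \
            np.issubdtype(image_data.dtype, np.integer):
        # Invert integer data: roughness = max_val - gloss.
        image_data = np.iinfo(image_data.dtype).max - image_data
        processing_map_type = processing_map_type.replace("GLOSS", "ROUGH")
        notes.append("Gloss-to-Rough applied")
    return image_data, processing_map_type, notes

# As at the call site above, the merged-task stage discards the returned
# type ("_"), since the input slot is identified by the rule's required type.
gloss = np.full((2, 2), 200, dtype=np.uint8)
rough, _, transform_notes = apply_common_map_transformations(
    gloss.copy(), "MAP_GLOSS", False, {}, "example")
print(int(rough[0, 0]))  # 55 (255 - 200)
print(transform_notes)   # ['Gloss-to-Rough applied']
```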

View File

@@ -194,83 +194,6 @@ class OutputOrganizationStage(ProcessingStage):
                context.asset_metadata['status'] = "Failed (Output Organization Error)"
                details['status'] = 'Organization Failed'
            # --- Handle legacy 'Processed_With_Variants' status (if still needed, otherwise remove) ---
            # This block is kept for potential backward compatibility but might be redundant
            # if 'Processed_Via_Save_Utility' is the new standard for variants.
            elif map_status == 'Processed_With_Variants':
                variants = details.get('variants') # Expects old structure: list of dicts with 'temp_path'
                if not variants:
                    logger.warning(f"Asset '{asset_name_for_log}': Map key '{processed_map_key}' (status '{map_status}') has no 'variants' list. Skipping.")
                    details['status'] = 'Organization Failed (Legacy Variants Missing)'
                    continue
                logger.debug(f"Asset '{asset_name_for_log}': Organizing {len(variants)} legacy variants for map key '{processed_map_key}' (map type: {base_map_type}).")
                map_metadata_entry = context.asset_metadata.setdefault('maps', {}).setdefault(processed_map_key, {})
                map_metadata_entry['map_type'] = base_map_type
                map_metadata_entry.setdefault('variant_paths', {})
                processed_any_variant_successfully = False
                failed_any_variant = False
                for variant_index, variant_detail in enumerate(variants):
                    temp_variant_path_str = variant_detail.get('temp_path') # Uses 'temp_path'
                    if not temp_variant_path_str:
                        logger.warning(f"Asset '{asset_name_for_log}': Legacy Variant {variant_index} for map '{processed_map_key}' is missing 'temp_path'. Skipping.")
                        continue
                    temp_variant_path = Path(temp_variant_path_str)
                    if not temp_variant_path.is_file():
                        logger.warning(f"Asset '{asset_name_for_log}': Legacy temporary variant file '{temp_variant_path}' for map '{processed_map_key}' not found. Skipping.")
                        continue
                    variant_resolution_key = variant_detail.get('resolution_key', f"varRes{variant_index}")
                    variant_ext = temp_variant_path.suffix.lstrip('.')
                    token_data_variant = {
                        "assetname": asset_name_for_log,
                        "supplier": context.effective_supplier or "DefaultSupplier",
                        "maptype": base_map_type,
                        "resolution": variant_resolution_key,
                        "ext": variant_ext,
                        "incrementingvalue": getattr(context, 'incrementing_value', None),
                        "sha5": getattr(context, 'sha5_value', None)
                    }
                    token_data_variant_cleaned = {k: v for k, v in token_data_variant.items() if v is not None}
                    output_filename_variant = generate_path_from_pattern(output_filename_pattern_config, token_data_variant_cleaned)
                    try:
                        relative_dir_path_str_variant = generate_path_from_pattern(
                            pattern_string=output_dir_pattern,
                            token_data=token_data_variant_cleaned
                        )
                        final_variant_path = Path(context.output_base_path) / Path(relative_dir_path_str_variant) / Path(output_filename_variant)
                        final_variant_path.parent.mkdir(parents=True, exist_ok=True)
                        if final_variant_path.exists() and not overwrite_existing:
                            logger.info(f"Asset '{asset_name_for_log}': Output legacy variant file {final_variant_path} exists and overwrite is disabled. Skipping copy.")
                        else:
                            shutil.copy2(temp_variant_path, final_variant_path)
                            logger.info(f"Asset '{asset_name_for_log}': Copied legacy variant {temp_variant_path} to {final_variant_path}.")
                            final_output_files.append(str(final_variant_path))
                        relative_final_variant_path_str = str(Path(relative_dir_path_str_variant) / Path(output_filename_variant))
                        map_metadata_entry['variant_paths'][variant_resolution_key] = relative_final_variant_path_str
                        processed_any_variant_successfully = True
                    except Exception as e:
                        logger.error(f"Asset '{asset_name_for_log}': Failed to copy legacy variant {temp_variant_path}. Error: {e}", exc_info=True)
                        context.status_flags['output_organization_error'] = True
                        context.asset_metadata['status'] = "Failed (Output Organization Error - Legacy Variant)"
                        failed_any_variant = True
                if failed_any_variant:
                    details['status'] = 'Organization Failed (Legacy Variants)'
                elif processed_any_variant_successfully:
                    details['status'] = 'Organized (Legacy Variants)'
                else:
                    details['status'] = 'Organization Skipped (No Legacy Variants Copied/Needed)'
            # --- Handle other statuses (Skipped, Failed, etc.) ---
            else: # Catches statuses not explicitly handled above
                logger.debug(f"Asset '{asset_name_for_log}': Skipping map key '{processed_map_key}' (status: '{map_status}') for organization as it's not a recognized final processed state or variant state.")

View File

@@ -91,57 +91,6 @@ class RegularMapProcessorStage(ProcessingStage):
        return final_internal_map_type
    def _apply_in_memory_transformations(
        self,
        image_data: np.ndarray,
        processing_map_type: str, # The potentially suffixed internal type
        invert_normal_green: bool,
        file_type_definitions: Dict[str, Dict],
        log_prefix: str
    ) -> Tuple[np.ndarray, str, List[str]]:
        """
        Applies in-memory transformations (Gloss-to-Rough, Normal Green Invert).
        Returns potentially transformed image data, potentially updated map type, and notes.
        """
        transformation_notes = []
        current_image_data = image_data # Start with original data
        updated_processing_map_type = processing_map_type # Start with original type
        # Gloss-to-Rough
        # Check if the base type is Gloss (before suffix)
        base_map_type_match = re.match(r"(MAP_GLOSS)", processing_map_type)
        if base_map_type_match:
            log.info(f"{log_prefix}: Applying Gloss-to-Rough conversion.")
            inversion_succeeded = False
            if np.issubdtype(current_image_data.dtype, np.floating):
                current_image_data = 1.0 - current_image_data
                current_image_data = np.clip(current_image_data, 0.0, 1.0)
                log.debug(f"{log_prefix}: Inverted float image data for Gloss->Rough.")
                inversion_succeeded = True
            elif np.issubdtype(current_image_data.dtype, np.integer):
                max_val = np.iinfo(current_image_data.dtype).max
                current_image_data = max_val - current_image_data
                log.debug(f"{log_prefix}: Inverted integer image data (max_val: {max_val}) for Gloss->Rough.")
                inversion_succeeded = True
            else:
                log.error(f"{log_prefix}: Unsupported image data type {current_image_data.dtype} for GLOSS map. Cannot invert.")
                transformation_notes.append("Gloss-to-Rough FAILED (unsupported dtype)")
            if inversion_succeeded:
                # Update the type string itself (e.g., MAP_GLOSS-1 -> MAP_ROUGH-1)
                updated_processing_map_type = processing_map_type.replace("GLOSS", "ROUGH")
                log.info(f"{log_prefix}: Map type updated: '{processing_map_type}' -> '{updated_processing_map_type}'")
                transformation_notes.append("Gloss-to-Rough applied")
        # Normal Green Invert
        # Check if the base type is Normal (before suffix)
        base_map_type_match_nrm = re.match(r"(MAP_NRM)", processing_map_type)
        if base_map_type_match_nrm and invert_normal_green:
            log.info(f"{log_prefix}: Applying Normal Map Green Channel Inversion (Global Setting).")
            current_image_data = ipu.invert_normal_map_green_channel(current_image_data)
            transformation_notes.append("Normal Green Inverted (Global)")
        return current_image_data, updated_processing_map_type, transformation_notes
    # --- Execute Method ---
@@ -237,7 +186,7 @@ class RegularMapProcessorStage(ProcessingStage):
            result.original_bit_depth = None # Indicate failure to determine
        # --- Apply Transformations ---
-        transformed_image_data, final_map_type, transform_notes = self._apply_in_memory_transformations(
+        transformed_image_data, final_map_type, transform_notes = ipu.apply_common_map_transformations(
            source_image_data.copy(), # Pass a copy to avoid modifying original load
            processing_map_type,
            invert_normal_green,

View File

@@ -426,4 +426,90 @@ def save_image(
        return True
    except Exception: # as e:
        # print(f"Error saving image {path_obj}: {e}") # Optional: for debugging utils
        return False
# --- Common Map Transformations ---
import re
import logging

ipu_log = logging.getLogger(__name__)

def apply_common_map_transformations(
    image_data: np.ndarray,
    processing_map_type: str, # The potentially suffixed internal type
    invert_normal_green: bool,
    file_type_definitions: Dict[str, Dict],
    log_prefix: str
) -> Tuple[np.ndarray, str, List[str]]:
    """
    Applies common in-memory transformations (Gloss-to-Rough, Normal Green Invert).
    Returns potentially transformed image data, potentially updated map type, and notes.
    """
    transformation_notes = []
    current_image_data = image_data # Start with original data
    updated_processing_map_type = processing_map_type # Start with original type
    # Gloss-to-Rough
    # Check if the base type is Gloss (before suffix)
    base_map_type_match = re.match(r"(MAP_GLOSS)", processing_map_type)
    if base_map_type_match:
        ipu_log.info(f"{log_prefix}: Applying Gloss-to-Rough conversion.")
        inversion_succeeded = False
        if np.issubdtype(current_image_data.dtype, np.floating):
            current_image_data = 1.0 - current_image_data
            current_image_data = np.clip(current_image_data, 0.0, 1.0)
            ipu_log.debug(f"{log_prefix}: Inverted float image data for Gloss->Rough.")
            inversion_succeeded = True
        elif np.issubdtype(current_image_data.dtype, np.integer):
            max_val = np.iinfo(current_image_data.dtype).max
            current_image_data = max_val - current_image_data
            ipu_log.debug(f"{log_prefix}: Inverted integer image data (max_val: {max_val}) for Gloss->Rough.")
            inversion_succeeded = True
        else:
            ipu_log.error(f"{log_prefix}: Unsupported image data type {current_image_data.dtype} for GLOSS map. Cannot invert.")
            transformation_notes.append("Gloss-to-Rough FAILED (unsupported dtype)")
        if inversion_succeeded:
            # Update the type string itself (e.g., MAP_GLOSS-1 -> MAP_ROUGH-1)
            updated_processing_map_type = processing_map_type.replace("GLOSS", "ROUGH")
            ipu_log.info(f"{log_prefix}: Map type updated: '{processing_map_type}' -> '{updated_processing_map_type}'")
            transformation_notes.append("Gloss-to-Rough applied")
    # Normal Green Invert
    # Check if the base type is Normal (before suffix)
    base_map_type_match_nrm = re.match(r"(MAP_NRM)", processing_map_type)
    if base_map_type_match_nrm and invert_normal_green:
        ipu_log.info(f"{log_prefix}: Applying Normal Map Green Channel Inversion (Global Setting).")
        current_image_data = invert_normal_map_green_channel(current_image_data)
        transformation_notes.append("Normal Green Inverted (Global)")
    return current_image_data, updated_processing_map_type, transformation_notes

# --- Normal Map Utilities ---
def invert_normal_map_green_channel(normal_map: np.ndarray) -> np.ndarray:
    """
    Inverts the green channel of a normal map.
    Assumes the normal map is in RGB or RGBA format (channel order R, G, B, A).
    """
    if normal_map is None or len(normal_map.shape) < 3 or normal_map.shape[2] < 3:
        # Not a valid color image with at least 3 channels
        return normal_map
    # Ensure data is mutable
    inverted_map = normal_map.copy()
    # Invert the green channel (index 1)
    # Handle different data types
    if np.issubdtype(inverted_map.dtype, np.floating):
        inverted_map[:, :, 1] = 1.0 - inverted_map[:, :, 1]
    elif np.issubdtype(inverted_map.dtype, np.integer):
        max_val = np.iinfo(inverted_map.dtype).max
        inverted_map[:, :, 1] = max_val - inverted_map[:, :, 1]
    else:
        # Unsupported dtype, return original
        print(f"Warning: Unsupported dtype {inverted_map.dtype} for normal map green channel inversion.")
        return normal_map
    return inverted_map
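For reference, the green-channel inversion added above can be exercised with a tiny self-contained sketch (the `invert_green` name below is a stand-in that reimplements the same logic, so the example runs on its own):

```python
import numpy as np

# Self-contained sketch of the green-channel inversion added above (uint8 case):
# only channel index 1 changes; R, B, and any alpha channel are left untouched.
def invert_green(normal_map: np.ndarray) -> np.ndarray:
    if normal_map is None or normal_map.ndim < 3 or normal_map.shape[2] < 3:
        return normal_map  # not a 3+ channel image; nothing to do
    out = normal_map.copy()
    if np.issubdtype(out.dtype, np.floating):
        out[:, :, 1] = 1.0 - out[:, :, 1]
    elif np.issubdtype(out.dtype, np.integer):
        out[:, :, 1] = np.iinfo(out.dtype).max - out[:, :, 1]
    return out

nm = np.zeros((2, 2, 3), dtype=np.uint8)
nm[:, :, 1] = 100
print(int(invert_green(nm)[0, 0, 1]))  # 155 (255 - 100); input left unmodified
```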