Dealing with Flattened Zip Archives in a Script-Driven Orchestrator
Working with orchestration tools often comes with unique quirks, especially when integrating infrastructure-as-code tools like Terraform. Recently, I encountered an unexpected challenge with a particular orchestrator that handles shell scripts and zipped artifacts in a less-than-intuitive way.
Packaging Terraform for Orchestration
Unlike more developer-friendly CI/CD platforms like Azure DevOps or GitHub Actions, this orchestrator requires you to package your scripts and dependencies into a zipped artifact. This artifact typically includes your Terraform root module, all subfolders containing local modules, and a custom shell script that executes the Terraform workflow.
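For context, the packaging step itself is nothing exotic, just zipping the whole working tree. A minimal sketch, assuming a hypothetical project folder and artifact name (the orchestrator's actual naming requirements may differ):

# Package the root module, local modules, and the deploy script into one artifact.
# "terraform-root" and "artifact.zip" are placeholder names for illustration.
cd terraform-root
zip -r ../artifact.zip main.tf modules/ scripts/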
Once submitted, the orchestrator deploys the zipped package to a container, extracts it, and runs the script. It was during that extraction step that I discovered a major issue.
The Flattening Problem
Instead of maintaining the original folder hierarchy, the orchestrator flattens the entire directory structure. Every file path is collapsed into a single filename with the directory names embedded in it, essentially mimicking the way files are represented in blob storage. In blob storage that is expected, since folders are simulated with delimiter-encoded names, but it is quite jarring in a shell script environment.
To illustrate, here’s how the file structure transforms upon extraction:
Expected hierarchical folder structure:
main.tf
scripts/
    deploy.sh
modules/
    networking/
        vnet.tf
        outputs.tf
    compute/
        vm.tf
Actual flattened files after extraction:
modules\networking\vnet.tf
modules\networking\outputs.tf
modules\compute\vm.tf
scripts\deploy.sh
main.tf
Here, the backslashes are literal characters in the filenames, not directory separators, which leaves the working directory essentially flat and unusable for Terraform in its default form.
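A quick way to convince yourself that the backslashes really are part of the names (a generic sanity check, not something from the deploy script itself):

# Print each entry literally, one per line; the backslashes appear as
# ordinary filename characters rather than directory separators.
for f in *; do
    printf '%s\n' "$f"
done

# There is no real "modules" directory, so this fails with "No such file or directory"
ls modules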
Rebuilding the Folder Structure with Bash
To resolve this, I had to reconstruct the directory hierarchy manually. With some helpful input from friends (and more than a little assistance from ChatGPT), I cobbled together a Bash script to reverse-engineer the folder structure from the embedded paths:
#!/usr/bin/env bash

# Loop through all entries in the current (flattened) directory
for file in *; do
    # Only touch regular files whose names contain a literal backslash
    if [[ -f "$file" && "$file" == *\\* ]]; then
        # Replace backslashes with forward slashes to recover the intended path
        new_path="${file//\\//}"
        # Extract the directory portion (everything before the last slash)
        dir_path="${new_path%/*}"
        echo "Creating directory path: $dir_path"
        # Create the directory path if it doesn't exist
        mkdir -p "$dir_path"
        echo "Moving file to: $new_path"
        # Move the file into the reconstructed directory structure
        mv "$file" "$new_path"
    fi
done
This script loops over the extracted files, looks for those that contain backslashes (indicating they came from a subfolder), and rebuilds the directory path using mkdir -p. It then moves the file into the reconstructed location.
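Once the tree is rebuilt, the rest of the deploy script can carry on as if the artifact had been extracted normally. A minimal sketch of that follow-on step, with a generic init/plan/apply sequence standing in for the real Terraform workflow:

# Run after the reconstruction loop above, from the artifact root.
# The exact commands and flags here are illustrative, not the actual script.
set -euo pipefail

terraform init -input=false
terraform plan -input=false -out=tfplan
terraform apply -input=false tfplan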
Is it ugly? Yes. Does it work? Also yes. Good enough for me.
That’s all I’ve got to say about that. ^_^
Conclusion
Writing this script was a reminder of how automation tools can sometimes impose frustrating limitations, but also how shell scripting — no matter how awkward it can feel — remains an invaluable skill. This experience made me appreciate the reliability of platforms that respect your artifact structure, but it also gave me a practical workaround that saved the day.
So while I might not be a Bash expert, I now have a script that lets me confidently work with this orchestrator’s quirks, restoring my Terraform modules to a usable state.