I wrote a bash script that:
- gets the latest GitHub CI/CD run
- checks the digest of the downloaded artifacts
- spawns a process to download and extract each artifact (in parallel) if the digest has changed
- waits until the download-and-extract processes have finished, then runs the script again (see the minimal sketch right after this list)
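To be clear about what I mean by "in parallel", the shape I am going for is the usual "spawn the jobs in the background, collect their PIDs, then wait on them" pattern, roughly like this minimal standalone sketch (the URLs and names here are placeholders, not my actual download code):

#!/bin/bash
# Minimal illustration of the spawn-then-wait pattern I am aiming for
# (placeholder URLs, not the real artifact URLs)
urls=("https://example.com/a.zip" "https://example.com/b.zip")
pids=()
for url in "${urls[@]}"; do
    (
        curl -sSL -o "/tmp/$(basename "$url")" "$url" # download one file
    ) & # run the download in the background
    pids+=($!) # remember the background PID
done
for pid in "${pids[@]}"; do
    wait "$pid" # block until that download has finished
done
echo "All downloads finished."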
I have a problem: the script spawns the parallel jobs just fine, but it does so every 10 seconds. For some reason the wait is not being respected.
Why does this happen, and how can I fix it?
This is my script:
#!/bin/bash
# GitHub repository details
REPO="RiceKX/Hydra" # Format: owner/repo, e.g., "octocat/Hello-World"
BRANCH="development" # The branch you want to check for the latest build
GITHUB_TOKEN="${GITHUB_TOKEN:-$(cat /path/to/your/token)}" # GitHub token with necessary permissions
GITHUB_API="https://api.github.com"
ARTIFACTS_DIR="latest_dev_build" # Directory to store downloaded artifacts
# Ensure the artifacts directory exists
mkdir -p "$ARTIFACTS_DIR"
# Run the script in an infinite loop
while true; do
    # Check if repository, branch, and GitHub token are provided
    if [ -z "$REPO" ] || [ -z "$BRANCH" ] || [ -z "$GITHUB_TOKEN" ]; then
        echo "Please set the repository, branch, and GitHub token in the script."
        exit 1
    fi

    # Step 1: Fetch the latest successful run for the specified branch
    echo "Fetching the latest successful build for branch '$BRANCH'..."
    RUN_ID=$(curl -s \
        -H "Authorization: token $GITHUB_TOKEN" \
        "$GITHUB_API/repos/$REPO/actions/runs?branch=$BRANCH&status=success&per_page=1" | \
        jq -r '.workflow_runs[0].id')

    if [ "$RUN_ID" == "null" ] || [ -z "$RUN_ID" ]; then
        echo "No successful builds found for branch '$BRANCH'."
        exit 1
    fi

    echo "Latest successful run ID: $RUN_ID"

    # Step 2: Fetch the artifacts associated with this run
    echo "Fetching artifacts for the latest run..."
    ARTIFACTS=$(curl -s \
        -H "Authorization: token $GITHUB_TOKEN" \
        "$GITHUB_API/repos/$REPO/actions/runs/$RUN_ID/artifacts")

    # Check if we successfully got the artifacts
    if [ -z "$ARTIFACTS" ] || [ "$(echo "$ARTIFACTS" | jq '.artifacts | length')" -eq 0 ]; then
        echo "No artifacts found for this run."
        exit 1
    fi

    # Step 3: Extract artifact details and download them in parallel
    PIDS=() # Store background process PIDs to wait for them later

    echo "$ARTIFACTS" | jq -r '.artifacts[] | "\(.name) \(.archive_download_url) \(.digest)"' | \
    while read ARTIFACT_NAME ARTIFACT_DOWNLOAD_URL ARTIFACT_DIGEST; do
        if [ -z "$ARTIFACT_NAME" ] || [ -z "$ARTIFACT_DOWNLOAD_URL" ] || [ -z "$ARTIFACT_DIGEST" ]; then
            echo "Error: Artifact name, download URL, or digest missing for artifact."
            continue
        fi

        ARTIFACT_PATH="$ARTIFACTS_DIR/$ARTIFACT_NAME.zip"

        # Check if the artifact already exists
        if [ -f "$ARTIFACT_PATH" ]; then
            # Calculate the local digest of the downloaded artifact
            LOCAL_DIGEST=$(sha256sum "$ARTIFACT_PATH" | cut -d' ' -f1)

            # If the local digest matches the GitHub digest, skip downloading
            if [ "$LOCAL_DIGEST" == "$ARTIFACT_DIGEST" ]; then
                echo "Artifact $ARTIFACT_NAME is already up to date (digest matches), skipping download."
                continue
            else
                echo "Artifact $ARTIFACT_NAME exists but the digest does not match. Re-downloading..."
            fi
        fi

        # Download and extract each artifact in parallel
        (
            echo "Downloading and extracting artifact: $ARTIFACT_NAME from $ARTIFACT_DOWNLOAD_URL"
            curl -L -o "$ARTIFACT_PATH" \
                -H "Authorization: token $GITHUB_TOKEN" \
                "$ARTIFACT_DOWNLOAD_URL"

            if [ $? -eq 0 ]; then
                echo "Artifact $ARTIFACT_NAME downloaded successfully."
                # Extract the downloaded artifact
                unzip -o "$ARTIFACT_PATH" -d "$HOME"
                echo "Artifact $ARTIFACT_NAME extracted to $HOME."
            else
                echo "Failed to download $ARTIFACT_NAME."
            fi
        ) & # Run this process in the background
        PIDS+=($!)
    done

    # Wait for all background processes to finish before continuing
    echo "Waiting for background processes to finish..."
    for PID in "${PIDS[@]}"; do
        wait "$PID"
    done

    echo "All artifacts downloaded and extracted successfully."

    # Sleep for 10 seconds before running the script again
    echo "Waiting for 10 seconds before checking again..."
    sleep 10
done