Merge pull request #220 from aws-solutions/release/v6.1.4
release/v6.1.4
eggoynes authored Sep 14, 2023
2 parents 75ff976 + c5d2787 commit 04e884e
Showing 36 changed files with 24,565 additions and 22,290 deletions.
11 changes: 9 additions & 2 deletions CHANGELOG.md
@@ -5,6 +5,13 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [6.1.4] - 2023-09-14

### Fixed

- Fixed typos ([PR #161](https://github.com/aws-solutions/video-on-demand-on-aws/pull/161), [PR #123](https://github.com/aws-solutions/video-on-demand-on-aws/pull/123), [PR #153](https://github.com/aws-solutions/video-on-demand-on-aws/pull/153))
- Fixed scripts to handle spaces in directory name ([Issue #141](https://github.com/aws-solutions/video-on-demand-on-aws/issues/141), [PR #142](https://github.com/aws-solutions/video-on-demand-on-aws/pull/142))

## [6.1.3] - 2023-07-10

### Changed
@@ -131,7 +138,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- Default encryption to SNS topic
- Environment variable to configure the AWS SDK to reuse TCP connections <https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/node-reusing-connections.html>
- SQS queue options at deployment to capture workflow outputs
-- Support for accelerated transcoding in Elemental MediaCoonvert: <https://aws.amazon.com/about-aws/whats-new/2019/10/announcing-new-aws-elemental-mediaconvert-features-for-accelerated-transcoding-dash-and-avc-video-quality/>
+- Support for accelerated transcoding in AWS Elemental MediaConvert: <https://aws.amazon.com/about-aws/whats-new/2019/10/announcing-new-aws-elemental-mediaconvert-features-for-accelerated-transcoding-dash-and-avc-video-quality/>
- Support for Glacier deep archive

### Changed
@@ -172,7 +179,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

- Typo in notification event property name (ecodeJobId -> encodeJobId)
- Links to issues in the CONTRIBUTING file
-- Anonymous metrics helper to remove ServiceToken (which includes Lambda ARN) from sent data
+- Anonymized metrics helper to remove ServiceToken (which includes Lambda ARN) from sent data
- Suffix for supported format (mv4 -> m4v)
- Custom resource to handle WorkflowTrigger parameter update

12 changes: 6 additions & 6 deletions README.md
@@ -52,7 +52,7 @@ The workflow configuration is set at deployment and is defined as environment va
* **MediaConvert_Template_720p:** The name of the SD template in MediaConvert
* **Source:** The name of the source S3 bucket
* **WorkflowName:** Used to tag all of the MediaConvert encoding jobs
-* **acceleratedTranscoding** Enabled Accelerated Transocding in MediaConvert. options include ENABLE, DISABLE, PREFERRED. for more detials please see:
+* **acceleratedTranscoding** Enabled Accelerated Transcoding in MediaConvert. options include ENABLE, DISABLED, PREFERRED. for more details please see: [Accelerated Transcoding](https://docs.aws.amazon.com/mediaconvert/latest/ug/accelerated-transcoding.html).
* **enableSns** Send SNS notifications for the workflow results.
* **enableSqs** Send the workflow results to an SQS queue.
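A deployment wrapper could sanity-check the acceleration option before launching. This helper is a sketch, not part of the solution's scripts; MediaConvert's documented acceleration modes are ENABLED/DISABLED/PREFERRED, and the README's "ENABLE" spelling may differ, so verify the accepted strings against your template:

```shell
# Sketch: validate an acceleratedTranscoding value before deployment.
# The accepted strings are an assumption based on MediaConvert's
# AccelerationSettings modes; this function is illustrative only.
validate_accelerated_transcoding() {
  case "$1" in
    ENABLED|DISABLED|PREFERRED)
      return 0 ;;
    *)
      echo "unsupported acceleratedTranscoding value: $1" >&2
      return 1 ;;
  esac
}
```

A wrapper would call this once per deployment and abort on a nonzero return, surfacing the bad value before any CloudFormation stack is created.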

@@ -108,15 +108,15 @@ At launch the Solution creates 3 MediaConvert job templates which are used as th
- **MediaConvert_Template_1080p**
- **MediaConvert_Template_720p**

-By default, the profiler step in the process step function will check the source video height and set the parameter "jobTemplate" to one of the available templates. This variable is then passed to the encoding step which submits a job to Elemental MediaConvert. To customize the encoding templates used by the solution you can either replace the existing templates or you can use the source metadata version of the workflow and define the jobTemplate as part of the source metadata file.
+By default, the profiler step in the process step function will check the source video height and set the parameter "jobTemplate" to one of the available templates. This variable is then passed to the encoding step which submits a job to AWS Elemental MediaConvert. To customize the encoding templates used by the solution you can either replace the existing templates or you can use the source metadata version of the workflow and define the jobTemplate as part of the source metadata file.

**To replace the templates:**
-1. Use the system templates or create 3 new templates through the MediaConvert console (see the Elemental MediaConvert documentation for details).
+1. Use the system templates or create 3 new templates through the MediaConvert console (see the AWS Elemental MediaConvert documentation for details).
2. Update the environment variables for the input validate lambda function with the names of the new templates.

**To define the job template using metadata:**
1. Launch the solution with source metadata parameter. See Appendix E for more details.
-2. Use the system templates or create a new template through the MediaConvert console (see the Elemental MediaConvert documentation for details).
+2. Use the system templates or create a new template through the MediaConvert console (see the AWS Elemental MediaConvert documentation for details).
3. Add "jobTemplate":"name of the template" to the metadata file; this overrides the profiler step in the process Step Functions.
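As an illustration of step 3, a minimal metadata file carrying the override could be written like this. Every field name except `jobTemplate` is an assumption (the template name is one of the solution's defaults); check the solution's Appendix E for the real schema:

```shell
# Sketch: write a hypothetical source metadata file that pins the
# encoding template. "srcVideo" is an assumed field name.
cat > example-metadata.json <<'EOF'
{
  "srcVideo": "example.mp4",
  "jobTemplate": "MediaConvert_Template_720p"
}
EOF
```

In the source-metadata workflow this JSON file, not the video itself, is what gets dropped into the source bucket to trigger an ingest.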

## QVBR Mode
@@ -149,7 +149,7 @@ For more detail please see [Accelerated Transcoding](https://docs.aws.amazon.com
* **archive-source:** Lambda function to tag the source video in S3 to enable the Glacier lifecycle policy.
* **custom-resource:** Lambda-backed CloudFormation custom resource to deploy MediaConvert templates and configure S3 event notifications.
* **dynamo:** Lambda function to update DynamoDB.
-* **encode:** Lambda function to submit an encoding job to Elemental MediaConvert.
+* **encode:** Lambda function to submit an encoding job to AWS Elemental MediaConvert.
* **error-handler:** Lambda function to handle any errors created by the workflow or MediaConvert.
* **input-validate:** Lambda function to parse S3 event notifications and define the workflow parameters.
* **media-package-assets:** Lambda function to ingest an asset into MediaPackage-VOD.
@@ -271,7 +271,7 @@ limitations under the License.

***

-This solution collects anonymous operational metrics to help AWS improve the
+This solution collects Anonymized operational metrics to help AWS improve the
quality of features of the solution. For more information, including how to disable
this capability, please see the [implementation guide](https://docs.aws.amazon.com/solutions/latest/video-on-demand/appendix-h.html).

42 changes: 21 additions & 21 deletions deployment/build-s3-dist.sh
@@ -45,25 +45,25 @@ source_dir="$template_dir/../source"
echo "------------------------------------------------------------------------------"
echo "[Init] Remove any old dist files from previous runs"
echo "------------------------------------------------------------------------------"
-rm -rf $template_dist_dir
-mkdir -p $template_dist_dir
+rm -rf "$template_dist_dir"
+mkdir -p "$template_dist_dir"

-rm -rf $build_dist_dir
-mkdir -p $build_dist_dir
+rm -rf "$build_dist_dir"
+mkdir -p "$build_dist_dir"

-rm -rf $staging_dist_dir
-mkdir -p $staging_dist_dir
+rm -rf "$staging_dist_dir"
+mkdir -p "$staging_dist_dir"

echo "------------------------------------------------------------------------------"
echo "[Init] Install dependencies for the cdk-solution-helper"
echo "------------------------------------------------------------------------------"
-cd $template_dir/cdk-solution-helper
+cd "$template_dir/cdk-solution-helper"
npm install --production

echo "------------------------------------------------------------------------------"
echo "Download mediainfo binary for AWS Lambda"
echo "------------------------------------------------------------------------------"
-cd $source_dir/mediainfo/
+cd "$source_dir/mediainfo/"
rm -rf bin/*
curl -O https://mediaarea.net/download/binary/mediainfo/20.09/MediaInfo_CLI_20.09_Lambda.zip
unzip MediaInfo_CLI_20.09_Lambda.zip
@@ -77,9 +77,9 @@ echo "--------------------------------------------------------------------------
# Make sure user has the newest CDK version
npm uninstall -g aws-cdk && npm install -g aws-cdk@2

-cd $source_dir/cdk
+cd "$source_dir/cdk"
npm install
-cdk synth --output=$staging_dist_dir
+cdk synth --output="$staging_dist_dir"
if [ $? -ne 0 ]
then
echo "******************************************************************************"
@@ -88,33 +88,33 @@ then
exit 1
fi

-cd $staging_dist_dir
+cd "$staging_dist_dir"
rm tree.json manifest.json cdk.out

echo "------------------------------------------------------------------------------"
echo "Run Cdk Helper and update template placeholders"
echo "------------------------------------------------------------------------------"
-mv VideoOnDemand.template.json $template_dist_dir/$solution_name.template
+mv VideoOnDemand.template.json "$template_dist_dir/$solution_name.template"

-node $template_dir/cdk-solution-helper/index
+node "$template_dir/cdk-solution-helper/index"

-for file in $template_dist_dir/*.template
+for file in "$template_dist_dir"/*.template
do
replace="s/%%BUCKET_NAME%%/$bucket_name/g"
-sed -i -e $replace $file
+sed -i -e $replace "$file"

replace="s/%%SOLUTION_NAME%%/$solution_name/g"
-sed -i -e $replace $file
+sed -i -e $replace "$file"

replace="s/%%VERSION%%/$solution_version/g"
-sed -i -e $replace $file
+sed -i -e $replace "$file"
done

echo "------------------------------------------------------------------------------"
echo "[Packing] Source code artifacts"
echo "------------------------------------------------------------------------------"
# ... For each asset.* source code artifact in the temporary /staging folder...
-cd $staging_dist_dir
+cd "$staging_dist_dir"
for d in `find . -mindepth 1 -maxdepth 1 -type d`; do
# Rename the artifact, removing the period for handler compatibility
pfname="$(basename -- $d)"
@@ -133,14 +133,14 @@ for d in `find . -mindepth 1 -maxdepth 1 -type d`; do
cd ..

# Copy the zipped artifact from /staging to /regional-s3-assets
-mv $fname.zip $build_dist_dir
+mv $fname.zip "$build_dist_dir"
done

echo "------------------------------------------------------------------------------"
echo "[Cleanup] Remove temporary files"
echo "------------------------------------------------------------------------------"
-rm -rf $staging_dist_dir
-rm -f $template_dist_dir/*.template-e
+rm -rf "$staging_dist_dir"
+rm -f "$template_dist_dir"/*.template-e

echo "------------------------------------------------------------------------------"
echo "S3 Packaging Complete"
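The placeholder loop above can be exercised in isolation; this sketch reuses the same `sed -i -e` substitution pattern on a throwaway template, with a made-up file name and bucket value (the quoting of `"$replace"` is an extra precaution, not something the script itself does):

```shell
# Self-contained sketch of the %%BUCKET_NAME%% substitution done above.
bucket_name="my-demo-bucket"
echo 'Bucket: %%BUCKET_NAME%%' > demo.template
replace="s/%%BUCKET_NAME%%/$bucket_name/g"
sed -i -e "$replace" demo.template
```

On BSD/macOS sed, `-i` expects a backup suffix, which is why the script later deletes stray `*.template-e` files; on GNU sed the command edits in place with no backup.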
26 changes: 13 additions & 13 deletions deployment/run-unit-tests.sh
@@ -13,10 +13,10 @@ prepare_jest_coverage_report() {

# prepare coverage reports
rm -fr coverage/lcov-report
-mkdir -p $coverage_reports_top_path/jest
-coverage_report_path=$coverage_reports_top_path/jest/$component_name
-rm -fr $coverage_report_path
-mv coverage $coverage_report_path
+mkdir -p "$coverage_reports_top_path/jest"
+coverage_report_path="$coverage_reports_top_path/jest/$component_name"
+rm -fr "$coverage_report_path"
+mv coverage "$coverage_report_path"
}

run_javascript_test() {
@@ -27,21 +27,21 @@ run_javascript_test() {
echo "[Test] Run javascript unit test with coverage for $component_name"
echo "------------------------------------------------------------------------------"
echo "cd $component_path"
-cd $component_path
+cd "$component_path"

# install dependencies
npm install --silent
# run unit tests
npm test

# prepare coverage reports
-prepare_jest_coverage_report $component_name
+prepare_jest_coverage_report "$component_name"
}

# Get reference for all important folders
template_dir="$PWD"
source_dir="$template_dir/../source"
-coverage_reports_top_path=$source_dir/test/coverage-reports
+coverage_reports_top_path="$source_dir/test/coverage-reports"

# Test the attached Lambda function
declare -a lambda_packages=(
@@ -62,9 +62,9 @@ declare -a lambda_packages=(

for lambda_package in "${lambda_packages[@]}"
do
-rm -rf $source_dir/$lambda_package/coverage
-mkdir $source_dir/$lambda_package/coverage
-run_javascript_test $source_dir/$lambda_package $lambda_package
+rm -rf "$source_dir/$lambda_package/coverage"
+mkdir "$source_dir/$lambda_package/coverage"
+run_javascript_test "$source_dir/$lambda_package" $lambda_package

# Check the result of the test and exit if a failure is identified
if [ $? -eq 0 ]
@@ -96,6 +96,6 @@ rm -rf coverage
pytest --cov=. --cov-report xml:coverage/coverage.xml
# fix source file path
sed -i -- 's/filename\=\"/filename\=\"source\/mediainfo\//g' coverage/coverage.xml
-mkdir -p $coverage_reports_top_path/pytest
-mv coverage $coverage_reports_top_path/pytest/mediainfo
-rm -rf pytests
+mkdir -p "$coverage_reports_top_path/pytest"
+mv coverage "$coverage_reports_top_path/pytest/mediainfo"
+rm -rf pytests
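Nearly every change in both scripts is the same fix: quoting variable expansions so that checkout paths containing spaces survive word splitting, which is the issue called out in the changelog. A minimal standalone demonstration with a made-up path:

```shell
# Why the added quotes matter: an unquoted expansion is split on
# whitespace, so a path with a space becomes two arguments.
coverage_report_path="coverage reports/jest/demo"
mkdir -p "$coverage_report_path"   # creates the single nested path
```

With the quotes removed, `mkdir -p $coverage_report_path` would instead receive two arguments and create a stray `coverage` directory plus `reports/jest/demo`.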
