Merge pull request #88 from abikouo/upload_file_to_s3
cloud.aws_ops.upload_file_to_s3 - A playbook to upload file into S3
abikouo authored Sep 6, 2023
2 parents 4f90a1e + 9b91d61 commit 8bbc480
Showing 14 changed files with 265 additions and 0 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -34,6 +34,7 @@ Name | Description
--- | ---
[cloud.aws_ops.eda](https://github.com/ansible-collections/cloud.aws_ops/blob/main/playbooks/README.md)|A set of playbooks to restore AWS Cloudtrail configurations, created for use with the [cloud.aws_manage_cloudtrail_encryption rulebook](https://github.com/ansible-collections/cloud.aws_ops/blob/main/extensions/eda/rulebooks/AWS_MANAGE_CLOUDTRAIL_ENCRYPTION.md).
[cloud.aws_ops.webapp](https://github.com/ansible-collections/cloud.aws_ops/blob/main/playbooks/webapp/README.md)|A set of playbooks to create, delete, or migrate a webapp on AWS.
[cloud.aws_ops.upload_file_to_s3](https://github.com/ansible-collections/cloud.aws_ops/blob/main/playbooks/UPLOAD_FILE_TO_S3.md)|A playbook to upload a local file to S3.

### Rulebooks
Name | Description
3 changes: 3 additions & 0 deletions changelogs/fragments/20230821-upload_file_to_s3.yaml
@@ -0,0 +1,3 @@
---
minor_changes:
  - cloud.aws_ops.upload_file_to_s3 - A playbook to upload a file from the local filesystem to an S3 bucket (https://github.com/redhat-cop/cloud.aws_ops/pull/88).
58 changes: 58 additions & 0 deletions playbooks/UPLOAD_FILE_TO_S3.md
@@ -0,0 +1,58 @@
## cloud.aws_ops.upload_file_to_s3

A playbook to upload a local file to S3. When running this playbook, the file to upload is expected to be located on the remote host; the controller host is responsible for the PUT operation on the S3 bucket.

## Variables

* **aws_access_key**: The AWS access key to use. _type_: **str**
* **aws_secret_key**: The AWS secret key that corresponds to the access key. _type_: **str**
* **aws_security_token**: The AWS security token if using temporary access and secret keys. _type_: **str**
* **aws_profile**: A named AWS profile to use for authentication. _type_: **str**

* **upload_file_to_s3_bucket_name**: (Required) Name of an existing bucket to upload the file to. _type_: **str**
* **upload_file_to_s3_file_path**: (Required) Path to a local file to upload. _type_: **path**
* **upload_file_to_s3_object_name**: Object name inside the bucket. Defaults to the file basename. _type_: **str**
* **upload_file_to_s3_object_permission**: The canned permission to apply to the created object. _type_: **str**, valid values are: _private_, _public-read_, _public-read-write_, _aws-exec-read_, _authenticated-read_, _bucket-owner-read_, _bucket-owner-full-control_
* **upload_file_to_s3_object_overwrite**: Whether to force overwrite of the remote object/key. _type_: **bool**
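
For reference, a vars file that exercises the optional object variables might look like the sketch below; the bucket name, file path, object key, and permission are purely illustrative values:

```yaml
---
# Illustrative values only; substitute your own bucket, path, and object settings.
upload_file_to_s3_bucket_name: my-test-bucket
upload_file_to_s3_file_path: /path/to/file/on/remote/host
upload_file_to_s3_object_name: backups/file.txt      # stored under this key instead of the file basename
upload_file_to_s3_object_permission: private         # one of the canned permissions listed above
upload_file_to_s3_object_overwrite: true             # overwrite the object if the key already exists
```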

## Dependencies

- NA

## Example

__vars.yaml__
```yaml
---
aws_profile: sample-profile
upload_file_to_s3_bucket_name: my-test-bucket
upload_file_to_s3_file_path: /path/to/file/on/remote/host
```
__playbook.yaml__
```yaml
---
- name: Upload a local file to S3
  ansible.builtin.import_playbook: cloud.aws_ops.upload_file_to_s3
```
__inventory.ini__
```
[all]
sample_host ansible_ssh_user=some_user ansible_host=xxx.xxx.xxx.xxx
```
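
If the file to upload already resides on the control node, the inventory can simply target localhost with a local connection (the integration tests added in this commit use the same approach):

```
[all]
localhost ansible_connection=local
```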

Run the following command:

```shell
ansible-playbook ./playbook.yaml -e "@./vars.yaml" -i inventory.ini
```
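
The optional variables can also be supplied on the command line instead of the vars file; the object name and permission below are illustrative values:

```shell
ansible-playbook ./playbook.yaml -e "@./vars.yaml" -i inventory.ini \
  -e "upload_file_to_s3_object_name=reports/report.txt" \
  -e "upload_file_to_s3_object_permission=private"
```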

## License

GNU General Public License v3.0 or later

See [LICENSE](https://github.com/ansible-collections/cloud.aws_troubleshooting/blob/main/LICENSE) for the full text.

## Author Information

- Ansible Cloud Content Team
65 changes: 65 additions & 0 deletions playbooks/upload_file_to_s3.yaml
@@ -0,0 +1,65 @@
---
- name: Upload a local file to S3
hosts: all
gather_facts: false

module_defaults:
group/aws:
aws_access_key: "{{ aws_access_key | default(omit) }}"
aws_secret_key: "{{ aws_secret_key | default(omit) }}"
session_token: "{{ aws_security_token | default(omit) }}"
profile: "{{ aws_profile | default(omit) }}"

tasks:
- name: Fail when 'upload_file_to_s3_bucket_name' is undefined
ansible.builtin.fail:
msg: 'S3 bucket should be defined as upload_file_to_s3_bucket_name'
when: upload_file_to_s3_bucket_name is undefined

- name: Fail when 'upload_file_to_s3_file_path' is undefined
ansible.builtin.fail:
msg: 'Path to a local file should be defined as upload_file_to_s3_file_path'
when: upload_file_to_s3_file_path is undefined

- name: Validate that file exists
ansible.builtin.stat:
path: "{{ upload_file_to_s3_file_path }}"
register: _stat

- name: Fail when file does not exist
ansible.builtin.fail:
msg: 'File {{ upload_file_to_s3_file_path }} does not exist.'
when: not _stat.stat.exists

- name: Fetch file from remote host and put into S3 bucket
vars:
upload_file_to_s3_file_basename: "{{ upload_file_to_s3_file_path | basename }}"
block:
        - name: Create temporary file on the controller to download into
ansible.builtin.tempfile:
suffix: .s3_download
register: _tempfile
delegate_to: localhost

        - name: Download file from remote host
ansible.builtin.fetch:
src: "{{ upload_file_to_s3_file_path }}"
dest: "{{ _tempfile.path }}"
flat: true

- name: Upload object into S3
amazon.aws.s3_object:
mode: put
bucket: "{{ upload_file_to_s3_bucket_name }}"
object: "{{ upload_file_to_s3_object_name | default(upload_file_to_s3_file_basename) }}"
src: "{{ _tempfile.path }}"
permission: "{{ upload_file_to_s3_object_permission | default(omit) }}"
overwrite: "{{ upload_file_to_s3_object_overwrite | default(omit) }}"
delegate_to: localhost

always:
        - name: Delete temporary file
ansible.builtin.file:
state: absent
path: "{{ _tempfile.path }}"
delegate_to: localhost
3 changes: 3 additions & 0 deletions tests/integration/targets/test_upload_file_to_s3/aliases
@@ -0,0 +1,3 @@
cloud/aws
time=50s
playbook/upload_file_to_s3
1 change: 1 addition & 0 deletions tests/integration/targets/test_upload_file_to_s3/files/test_object.txt
@@ -0,0 +1 @@
This content has been generated to validate the playbook `cloud.aws_ops.upload_file_to_s3`; it will be uploaded to an S3 bucket.
2 changes: 2 additions & 0 deletions tests/integration/targets/test_upload_file_to_s3/inventory/upload_file.ini
@@ -0,0 +1,2 @@
[all]
upload_to_s3_host ansible_connection=local ansible_python_interpreter=auto
22 changes: 22 additions & 0 deletions tests/integration/targets/test_upload_file_to_s3/runme.sh
@@ -0,0 +1,22 @@
#!/usr/bin/env bash

set -eux

function cleanup() {
ansible-playbook teardown.yaml "$@"
exit 1
}

trap 'cleanup "${@}"' ERR

# Create resources for testing
ansible-playbook setup.yaml "$@"

# Upload to S3
ansible-playbook upload_file.yaml -e "@upload_file_vars.yaml" -i ./inventory/upload_file.ini "$@"

# Validate that file has been successfully uploaded as expected
ansible-playbook validate.yaml -e "@upload_file_vars.yaml" "$@"

# Delete resources
ansible-playbook teardown.yaml "$@"
25 changes: 25 additions & 0 deletions tests/integration/targets/test_upload_file_to_s3/setup.yaml
@@ -0,0 +1,25 @@
---
- name: Initialize resources for tests
hosts: localhost
gather_facts: false

vars_files:
- vars/main.yaml

module_defaults:
group/aws:
region: '{{ aws_region }}'
aws_access_key: '{{ aws_access_key }}'
aws_secret_key: '{{ aws_secret_key }}'
security_token: '{{ security_token | default(omit) }}'

tasks:
- name: Create AWS session credentials file
ansible.builtin.template:
src: upload_file_vars.yaml.j2
dest: upload_file_vars.yaml
mode: '0755'

- name: Create S3 bucket
amazon.aws.s3_bucket:
name: "{{ s3_bucket_name }}"
27 changes: 27 additions & 0 deletions tests/integration/targets/test_upload_file_to_s3/teardown.yaml
@@ -0,0 +1,27 @@
---
- name: Delete resources used for testing
hosts: localhost
gather_facts: false

vars_files:
- vars/main.yaml

module_defaults:
group/aws:
region: '{{ aws_region }}'
aws_access_key: '{{ aws_access_key }}'
aws_secret_key: '{{ aws_secret_key }}'
security_token: '{{ security_token | default(omit) }}'

tasks:

- name: Delete S3 bucket
amazon.aws.s3_bucket:
name: "{{ s3_bucket_name }}"
force: true
state: absent

- name: Delete vars file
ansible.builtin.file:
path: upload_file_vars.yaml
state: absent
9 changes: 9 additions & 0 deletions tests/integration/targets/test_upload_file_to_s3/templates/upload_file_vars.yaml.j2
@@ -0,0 +1,9 @@
---
aws_access_key: "{{ aws_access_key }}"
aws_secret_key: "{{ aws_secret_key }}"
{% if security_token is defined %}
aws_security_token: "{{ security_token }}"
{% endif %}
upload_file_to_s3_bucket_name: "{{ s3_bucket_name }}"
upload_file_to_s3_file_path: "{{ playbook_dir }}/files/test_object.txt"
upload_file_to_s3_object_name: "{{ s3_object_name }}"
3 changes: 3 additions & 0 deletions tests/integration/targets/test_upload_file_to_s3/upload_file.yaml
@@ -0,0 +1,3 @@
---
- name: Import playbook 'cloud.aws_ops.upload_file_to_s3'
ansible.builtin.import_playbook: cloud.aws_ops.upload_file_to_s3
43 changes: 43 additions & 0 deletions tests/integration/targets/test_upload_file_to_s3/validate.yaml
@@ -0,0 +1,43 @@
---
- name: Validate that the file has been uploaded to S3
hosts: localhost
gather_facts: false

vars_files:
- vars/main.yaml

module_defaults:
group/aws:
region: '{{ aws_region }}'
aws_access_key: '{{ aws_access_key }}'
aws_secret_key: '{{ aws_secret_key }}'
security_token: '{{ security_token | default(omit) }}'

tasks:
- name: Create temporary file to download object content in
ansible.builtin.tempfile:
suffix: .s3
register: _tmpfile

- name: Download object from S3 and validate content
block:
- name: Read object from S3 bucket
amazon.aws.s3_object:
bucket: "{{ s3_bucket_name }}"
dest: "{{ _tmpfile.path }}"
mode: get
object: "{{ s3_object_name }}"

- name: Validate content
ansible.builtin.assert:
that:
- s3_content == local_content
vars:
s3_content: "{{ lookup('file', _tmpfile.path) }}"
local_content: "{{ lookup('file', upload_file_to_s3_file_path) }}"

always:
- name: Delete temporary file
ansible.builtin.file:
path: "{{ _tmpfile.path }}"
state: absent
3 changes: 3 additions & 0 deletions tests/integration/targets/test_upload_file_to_s3/vars/main.yaml
@@ -0,0 +1,3 @@
---
s3_bucket_name: "bucket-{{ resource_prefix }}"
s3_object_name: "key-{{ resource_prefix }}"
