
Fix uninstall test for protected packages #98

Merged (2 commits, Jul 4, 2024)
Conversation

@Korulag (Contributor) commented on Jul 3, 2024


github-actions bot commented on Jul 3, 2024

33 passed

Code Coverage Summary

Package Line Rate
alts.scheduler 0%
alts.shared 87%
alts.shared.uploaders 37%
alts.shared.utils 52%
alts.worker 6%
alts.worker.executors 73%
alts.worker.runners 28%
Summary 37% (914 / 2457)

Linter reports

Pylint report
************* Module alts.worker.runners.base
alts/worker/runners/base.py:305:0: C0301: Line too long (81/80) (line-too-long)
alts/worker/runners/base.py:636:0: C0301: Line too long (86/80) (line-too-long)
alts/worker/runners/base.py:857:0: C0301: Line too long (81/80) (line-too-long)
alts/worker/runners/base.py:862:0: C0301: Line too long (97/80) (line-too-long)
alts/worker/runners/base.py:1055:0: C0301: Line too long (87/80) (line-too-long)
alts/worker/runners/base.py:1121:0: C0301: Line too long (81/80) (line-too-long)
alts/worker/runners/base.py:1556:0: C0301: Line too long (82/80) (line-too-long)
alts/worker/runners/base.py:1680:0: C0301: Line too long (93/80) (line-too-long)
alts/worker/runners/base.py:1718:0: C0301: Line too long (91/80) (line-too-long)
alts/worker/runners/base.py:1:0: C0302: Too many lines in module (1738/1000) (too-many-lines)
alts/worker/runners/base.py:441:5: W0511: TODO: Think of better implementation (fixme)
alts/worker/runners/base.py:449:5: W0511: TODO: Think of better implementation (fixme)
alts/worker/runners/base.py:106:0: C0116: Missing function or method docstring (missing-function-docstring)
alts/worker/runners/base.py:118:16: W0212: Access to a protected member _raise_if_aborted of a client class (protected-access)
alts/worker/runners/base.py:119:19: W0212: Access to a protected member _work_dir of a client class (protected-access)
alts/worker/runners/base.py:119:56: W0212: Access to a protected member _work_dir of a client class (protected-access)
alts/worker/runners/base.py:129:21: W0212: Access to a protected member _artifacts of a client class (protected-access)
alts/worker/runners/base.py:136:39: W0212: Access to a protected member _artifacts of a client class (protected-access)
alts/worker/runners/base.py:137:20: W0212: Access to a protected member _artifacts of a client class (protected-access)
alts/worker/runners/base.py:138:25: W0212: Access to a protected member _artifacts of a client class (protected-access)
alts/worker/runners/base.py:144:12: W0212: Access to a protected member _stats of a client class (protected-access)
alts/worker/runners/base.py:150:16: W0212: Access to a protected member _logger of a client class (protected-access)
alts/worker/runners/base.py:157:12: W0212: Access to a protected member _logger of a client class (protected-access)
alts/worker/runners/base.py:115:8: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
alts/worker/runners/base.py:166:0: R0205: Class 'BaseRunner' inherits from object, can be safely removed from bases in python3 (useless-object-inheritance)
alts/worker/runners/base.py:589:19: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/base.py:578:4: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
alts/worker/runners/base.py:623:4: C0116: Missing function or method docstring (missing-function-docstring)
alts/worker/runners/base.py:697:4: R0914: Too many local variables (16/15) (too-many-locals)
alts/worker/runners/base.py:718:19: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/base.py:742:4: C0116: Missing function or method docstring (missing-function-docstring)
alts/worker/runners/base.py:742:4: R0914: Too many local variables (19/15) (too-many-locals)
alts/worker/runners/base.py:844:27: W0612: Unused variable 'stderr' (unused-variable)
alts/worker/runners/base.py:1077:27: W0612: Unused variable 'stderr' (unused-variable)
alts/worker/runners/base.py:1131:12: R1724: Unnecessary "else" after "continue", remove the "else" and de-indent the code inside it (no-else-continue)
alts/worker/runners/base.py:1141:4: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
alts/worker/runners/base.py:1181:4: R0914: Too many local variables (16/15) (too-many-locals)
alts/worker/runners/base.py:1249:15: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/base.py:1263:4: C0116: Missing function or method docstring (missing-function-docstring)
alts/worker/runners/base.py:1263:4: R0914: Too many local variables (20/15) (too-many-locals)
alts/worker/runners/base.py:1336:4: C0116: Missing function or method docstring (missing-function-docstring)
alts/worker/runners/base.py:1401:4: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
alts/worker/runners/base.py:1424:19: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/base.py:1448:15: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/base.py:1453:19: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/base.py:166:0: R0904: Too many public methods (49/20) (too-many-public-methods)
alts/worker/runners/base.py:1527:0: C0115: Missing class docstring (missing-class-docstring)
alts/worker/runners/base.py:1527:0: W0223: Method '_render_tf_main_file' is abstract in class 'BaseRunner' but is not overridden in child class 'GenericVMRunner' (abstract-method)
alts/worker/runners/base.py:1527:0: W0223: Method '_render_tf_variables_file' is abstract in class 'BaseRunner' but is not overridden in child class 'GenericVMRunner' (abstract-method)
alts/worker/runners/base.py:1655:12: W0702: No exception type(s) specified (bare-except)
alts/worker/runners/base.py:1664:4: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
alts/worker/runners/base.py:1678:16: W0612: Unused variable 'attempt' (unused-variable)

-----------------------------------
Your code has been rated at 9.37/10
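The most repeated warning above is R1710 (inconsistent-return-statements), reported five times. A minimal sketch of the fix pattern, using a hypothetical helper that is not taken from the repository:

```python
from typing import List, Optional


def find_config_bad(paths: List[str]) -> Optional[str]:
    # Triggers R1710: one branch returns a value, while falling off the end
    # of the loop returns None implicitly.
    for path in paths:
        if path.endswith('.yml'):
            return path


def find_config_good(paths: List[str]) -> Optional[str]:
    # Fixed: the no-match case returns None explicitly, so every exit
    # from the function returns an expression.
    for path in paths:
        if path.endswith('.yml'):
            return path
    return None
```

The same "make every exit explicit" rule applies to the flagged methods in `alts/worker/runners/base.py`; pylint only checks that return statements within one function are consistent, not what they return.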


Black report
--- alts/worker/runners/base.py	2024-07-04 13:20:09.370111+00:00
+++ alts/worker/runners/base.py	2024-07-04 13:20:55.772033+00:00
@@ -93,16 +93,20 @@
     '.sh': ShellExecutor,
     '.yml': AnsibleExecutor,
     '.yaml': AnsibleExecutor,
 }
 
-DetectExecutorResult = Type[Optional[Union[
-    AnsibleExecutor,
-    BatsExecutor,
-    CommandExecutor,
-    ShellExecutor,
-]]]
+DetectExecutorResult = Type[
+    Optional[
+        Union[
+            AnsibleExecutor,
+            BatsExecutor,
+            CommandExecutor,
+            ShellExecutor,
+        ]
+    ]
+]
 
 
 def command_decorator(
     artifacts_key,
     error_message,
@@ -300,20 +304,20 @@
         return self._test_env.get('use_deprecated_ansible', False)
 
     @property
     def ansible_binary(self) -> str:
         if self.use_deprecated_ansible:
-            return os.path.join(CONFIG.deprecated_ansible_venv, 'bin', 'ansible')
+            return os.path.join(
+                CONFIG.deprecated_ansible_venv, 'bin', 'ansible'
+            )
         return 'ansible'
 
     @property
     def ansible_playbook_binary(self) -> str:
         if self.use_deprecated_ansible:
             return os.path.join(
-                CONFIG.deprecated_ansible_venv,
-                'bin',
-                'ansible-playbook'
+                CONFIG.deprecated_ansible_venv, 'bin', 'ansible-playbook'
             )
         return 'ansible-playbook'
 
     @property
     def vm_disk_size(self) -> int:
@@ -385,12 +389,13 @@
                 parsed.path,
                 parsed.params,
                 parsed.query,
                 parsed.fragment,
             ))
-            if (self.dist_name in CONFIG.debian_flavors
-                    and not repo['url'].startswith('deb ')):
+            if self.dist_name in CONFIG.debian_flavors and not repo[
+                'url'
+            ].startswith('deb '):
                 url = f'deb {url} ./'
                 self._logger.info('Modified repo url: %s', url)
             repo['url'] = url
         self._logger.info('Repositories: %s', self._repositories)
 
@@ -439,13 +444,11 @@
             pass
 
     # TODO: Think of better implementation
     def _create_work_dir(self):
         if not self._work_dir or not os.path.exists(self._work_dir):
-            self._work_dir = Path(
-                tempfile.mkdtemp(prefix=self.TEMPFILE_PREFIX)
-            )
+            self._work_dir = Path(tempfile.mkdtemp(prefix=self.TEMPFILE_PREFIX))
         return self._work_dir
 
     # TODO: Think of better implementation
     def _create_artifacts_dir(self):
         if not self._work_dir:
@@ -513,12 +516,14 @@
             if (
                 self.dist_name in CONFIG.rhel_flavors
                 and self.dist_version in ('8', '9', '10')
                 and package_version
             ):
-                full_pkg_name = (f'{package_name}{delimiter}{package_epoch}:'
-                                 f'{package_version}')
+                full_pkg_name = (
+                    f'{package_name}{delimiter}{package_epoch}:'
+                    f'{package_version}'
+                )
         return full_pkg_name
 
     # First step
     def prepare_work_dir_files(self):
         # In case if you've removed worker folder, recreate one
@@ -562,13 +567,17 @@
             'container_name': str(self.env_name),
         }
 
     def __terraform_init(self):
         with FileLock(TF_INIT_LOCK_PATH, timeout=60, thread_local=False):
-            return local['terraform'].with_cwd(self._work_dir).run(
-                ('init', '-no-color'),
-                timeout=CONFIG.provision_timeout,
+            return (
+                local['terraform']
+                .with_cwd(self._work_dir)
+                .run(
+                    ('init', '-no-color'),
+                    timeout=CONFIG.provision_timeout,
+                )
             )
 
     # After: prepare_work_dir_files
     @command_decorator(
         'initialize_terraform',
@@ -606,14 +615,18 @@
         )
         self._logger.debug('Running "terraform apply --auto-approve" command')
         cmd_args = ['apply', '--auto-approve', '-no-color']
         if self.TF_VARIABLES_FILE:
             cmd_args.extend(['--var-file', self.TF_VARIABLES_FILE])
-        return local['terraform'].with_cwd(self._work_dir).run(
-            args=cmd_args,
-            retcode=None,
-            timeout=CONFIG.provision_timeout,
+        return (
+            local['terraform']
+            .with_cwd(self._work_dir)
+            .run(
+                args=cmd_args,
+                retcode=None,
+                timeout=CONFIG.provision_timeout,
+            )
         )
 
     # After: start_env
     @command_decorator(
         'initial_provision',
@@ -631,11 +644,14 @@
             'pytest_is_needed': self.pytest_is_needed,
             'development_mode': CONFIG.development_mode,
             'package_proxy': CONFIG.package_proxy,
         }
         dist_major_version = self.dist_version[0]
-        if self.dist_name in CONFIG.rhel_flavors and dist_major_version in ('6', '7'):
+        if self.dist_name in CONFIG.rhel_flavors and dist_major_version in (
+            '6',
+            '7',
+        ):
             epel_release_url = CONFIG.epel_release_urls.get(dist_major_version)
             if epel_release_url:
                 var_dict['epel_release_url'] = epel_release_url
         if CONFIG.centos_baseurl:
             var_dict['centos_repo_baseurl'] = CONFIG.centos_baseurl
@@ -656,39 +672,57 @@
         self._logger.debug(
             'Running "ansible-playbook %s" command',
             cmd_args_str,
         )
         try:
-            return local[self.ansible_playbook_binary].with_cwd(
-                self._work_dir).run(
+            return (
+                local[self.ansible_playbook_binary]
+                .with_cwd(self._work_dir)
+                .run(
                     args=cmd_args,
                     retcode=None,
                     timeout=CONFIG.provision_timeout,
                 )
+            )
         except ProcessTimedOut as e:
             return 1, '', f'Provision has timed out: {e}'
         except ProcessExecutionError as e:
             return 1, '', f'Provision exited abnormally: {e}'
 
     def get_system_info_commands_list(self) -> Dict[str, tuple]:
         self._logger.debug('Returning default system info commands list')
         basic_commands = BASE_SYSTEM_INFO_COMMANDS.copy()
         if self._dist_name in CONFIG.rhel_flavors:
             basic_commands['Installed packages'] = ('rpm', '-qa')
-            basic_commands['Repositories list'] = (
-                 self.pkg_manager, 'repolist'
-            )
+            basic_commands['Repositories list'] = (self.pkg_manager, 'repolist')
             basic_commands['Repositories details'] = (
-                'find', '/etc/yum.repos.d/', '-type', 'f',
-                '-exec', 'cat', '{}', '+'
+                'find',
+                '/etc/yum.repos.d/',
+                '-type',
+                'f',
+                '-exec',
+                'cat',
+                '{}',
+                '+',
             )
         else:
             basic_commands['Installed packages'] = ('dpkg', '-l')
             basic_commands['Repositories list'] = ('apt-cache', 'policy')
             basic_commands['Repositories details'] = (
-                'find', '/etc/apt/', '-type', 'f', '-name', '*.list*',
-                '-o', '-name', '*.sources*', '-exec', 'cat', '{}', '+'
+                'find',
+                '/etc/apt/',
+                '-type',
+                'f',
+                '-name',
+                '*.list*',
+                '-o',
+                '-name',
+                '*.sources*',
+                '-exec',
+                'cat',
+                '{}',
+                '+',
             )
         return basic_commands
 
     @command_decorator(
         'system_info',
@@ -701,14 +735,11 @@
         error_output = ''
         executor_params = self.get_test_executor_params()
         executor_params['timeout'] = CONFIG.commands_exec_timeout
         for section, cmd in self.get_system_info_commands_list().items():
             start = datetime.datetime.utcnow()
-            self._logger.info(
-                'Running "%s" for env %s',
-                cmd, self.env_name
-            )
+            self._logger.info('Running "%s" for env %s', cmd, self.env_name)
             try:
                 binary, *args = cmd
                 result = CommandExecutor(binary, **executor_params).run(args)
                 output = '\n'.join([result.stdout, result.stderr])
                 if result.is_successful():
@@ -718,11 +749,13 @@
             except Exception as e:
                 errored_commands[section] = str(e)
             finish = datetime.datetime.utcnow()
             self._logger.info(
                 '"%s" for env %s took %s',
-                cmd, self.env_name, str(finish - start)
+                cmd,
+                self.env_name,
+                str(finish - start),
             )
         success_output = '\n\n'.join((
             section + '\n' + section_out
             for section, section_out in successful_commands.items()
         ))
@@ -791,14 +824,18 @@
             'Running "ansible-playbook %s" command',
             cmd_args_str,
         )
         try:
             cmd = self.ansible_playbook_binary
-            exit_code, stdout, stderr = local[cmd].with_cwd(self._work_dir).run(
-                args=cmd_args,
-                retcode=None,
-                timeout=CONFIG.provision_timeout,
+            exit_code, stdout, stderr = (
+                local[cmd]
+                .with_cwd(self._work_dir)
+                .run(
+                    args=cmd_args,
+                    retcode=None,
+                    timeout=CONFIG.provision_timeout,
+                )
             )
         except (ProcessExecutionError, ProcessTimedOut) as e:
             self._logger.error('Cannot install package: %s', str(e))
             stdout = ''
             stderr = f'Failed to install package: \n{e}'
@@ -848,27 +885,35 @@
             return []
         files = [i.strip() for i in stdout.split('\n') if i.strip()]
         protected = []
         for file_ in files:
             exit_code, stdout, stderr = self.exec_command(
-                'cat', f'/etc/{self.pkg_manager}/protected.d/{file_}',
+                'cat',
+                f'/etc/{self.pkg_manager}/protected.d/{file_}',
             )
             if exit_code != 0:
                 continue
-            file_protected = [i.strip() for i in stdout.split('\n') if i.strip()]
+            file_protected = [
+                i.strip() for i in stdout.split('\n') if i.strip()
+            ]
             if file_protected:
                 protected.extend(file_protected)
         protected.append('kernel-core')
         dnf_command = (
-            r'dnf', '-q', '--qf=%{NAME}', 'repoquery', '--requires', '--resolve', '--recursive' +
-            ' '.join(protected)
+            r'dnf',
+            '-q',
+            '--qf=%{NAME}',
+            'repoquery',
+            '--requires',
+            '--resolve',
+            '--recursive' + ' '.join(protected),
         )
         exit_code, stdout, stderr = self.exec_command(*dnf_command)
         if exit_code != 0:
             self._logger.warning(
                 'Cannot resolve non-uninstallable packages via DNF: %s',
-                dnf_command
+                dnf_command,
             )
             return protected
         dnf_protected = [i.strip() for i in stdout.split('\n') if i.strip()]
         if dnf_protected:
             protected.extend(dnf_protected)
@@ -907,14 +952,18 @@
             'Running "ansible-playbook %s" command',
             cmd_args_str,
         )
         cmd = self.ansible_playbook_binary
         try:
-            return local[cmd].with_cwd(self._work_dir).run(
-                args=cmd_args,
-                retcode=None,
-                timeout=CONFIG.provision_timeout,
+            return (
+                local[cmd]
+                .with_cwd(self._work_dir)
+                .run(
+                    args=cmd_args,
+                    retcode=None,
+                    timeout=CONFIG.provision_timeout,
+                )
             )
         except (ProcessExecutionError, ProcessTimedOut) as e:
             self._logger.error('Cannot uninstall package:\n%s', str(e))
             return 1, '', f'Cannot uninstall package:\n{e}'
 
@@ -980,33 +1029,35 @@
         self._logger.info(
             'Running package integrity tests for %s on %s...',
             full_pkg_name,
             self.env_name,
         )
-        return local['py.test'].with_cwd(self._integrity_tests_dir).run(
-            args=cmd_args,
-            retcode=None,
-            timeout=CONFIG.tests_exec_timeout,
+        return (
+            local['py.test']
+            .with_cwd(self._integrity_tests_dir)
+            .run(
+                args=cmd_args,
+                retcode=None,
+                timeout=CONFIG.tests_exec_timeout,
+            )
         )
 
     @staticmethod
     def prepare_gerrit_repo_url(url: str) -> str:
         parsed = urllib.parse.urlparse(url)
         if CONFIG.gerrit_username:
             netloc = f'{CONFIG.gerrit_username}@{parsed.netloc}'
         else:
             netloc = parsed.netloc
-        return urllib.parse.urlunparse(
-            (
-                parsed.scheme,
-                netloc,
-                parsed.path,
-                parsed.params,
-                parsed.query,
-                parsed.fragment,
-            )
-        )
+        return urllib.parse.urlunparse((
+            parsed.scheme,
+            netloc,
+            parsed.path,
+            parsed.params,
+            parsed.query,
+            parsed.fragment,
+        ))
 
     def clone_third_party_repo(
         self,
         repo_url: str,
         git_ref: str,
@@ -1024,37 +1075,41 @@
         if not repo_name.endswith('.git'):
             repo_name += '.git'
         repo_reference_dir = None
         if CONFIG.git_reference_directory:
             repo_reference_dir = os.path.join(
-                CONFIG.git_reference_directory, repo_name)
+                CONFIG.git_reference_directory, repo_name
+            )
         repo_path = None
         for attempt in range(1, 6):
             try:
                 repo_path = func(
                     repo_url,
                     git_ref,
                     self._work_dir,
                     self._logger,
-                    reference_directory=repo_reference_dir
+                    reference_directory=repo_reference_dir,
                 )
             except (ProcessExecutionError, ProcessTimedOut):
                 pass
             if not repo_path:
                 self._logger.warning(
                     'Attempt %d to clone %s locally has failed',
-                    attempt, repo_url
+                    attempt,
+                    repo_url,
                 )
                 self._logger.debug('Sleeping before making another attempt')
                 time.sleep(random.randint(5, 10))
             else:
                 break
         return repo_path
 
     def run_third_party_test(
         self,
-        executor: Union[AnsibleExecutor, BatsExecutor, CommandExecutor, ShellExecutor],
+        executor: Union[
+            AnsibleExecutor, BatsExecutor, CommandExecutor, ShellExecutor
+        ],
         cmd_args: List[str],
         docker_args: Optional[List[str]] = None,
         workdir: str = '',
         artifacts_key: str = '',
         additional_section_name: str = '',
@@ -1068,12 +1123,16 @@
         package_version: Optional[str] = None,
     ) -> bool:
         if self.dist_name in CONFIG.rhel_flavors:
             cmd = ('rpm', '-q', package_name)
         elif self.dist_name in CONFIG.debian_flavors:
-            cmd = ('dpkg-query', '-Wf', r'${db:Status-Status} ${Package}\n',
-                   package_name)
+            cmd = (
+                'dpkg-query',
+                '-Wf',
+                r'${db:Status-Status} ${Package}\n',
+                package_name,
+            )
         else:
             raise ValueError(f'Unknown distribution: {self.dist_name}')
         exit_code, stdout, stderr = self.exec_command(*cmd)
         installed = exit_code == 0
         if installed and package_version:
@@ -1093,11 +1152,11 @@
         if not package_installed:
             self.install_package_no_log(
                 package_name,
                 package_version=package_version,
                 package_epoch=package_epoch,
-                semi_verbose=True
+                semi_verbose=True,
             )
 
     def get_init_script(self, tests_dir: Path) -> Optional[Path]:
         init = None
         for test in tests_dir.iterdir():
@@ -1116,11 +1175,19 @@
     def find_tests(self, tests_dir: str) -> List[Path]:
         self._logger.info('Looking tests on the remote in %s', tests_dir)
         if not tests_dir.endswith('/'):
             tests_dir += '/'
         _, stdout, _ = self.exec_command(
-            'find', tests_dir, '-maxdepth', '1', '-type', 'f', '-o', '-type', 'l'
+            'find',
+            tests_dir,
+            '-maxdepth',
+            '1',
+            '-type',
+            'f',
+            '-o',
+            '-type',
+            'l',
         )
         tests_list = [Path(i) for i in stdout.split('\n')]
         self._logger.debug('Tests list: %s', tests_list)
         tests_list.sort()
         organized_tests_list = []
@@ -1155,16 +1222,17 @@
             if re.search(regex, magic_out, re.IGNORECASE):
                 return executor_class_  # noqa
         return ShellExecutor  # noqa
 
     def detect_python_binary(
-        self,
-        test_path: Union[Path, str]
+        self, test_path: Union[Path, str]
     ) -> Tuple[str, str]:
         default_python = 'python3'
-        if (self.dist_name in CONFIG.rhel_flavors
-                and self.dist_version.startswith(('6', '7'))):
+        if (
+            self.dist_name in CONFIG.rhel_flavors
+            and self.dist_version.startswith(('6', '7'))
+        ):
             default_python = 'python'
         with open(test_path, 'rt') as f:
             shebang = f.readline()
             result = INTERPRETER_REGEX.search(shebang)
             if not result:
@@ -1192,19 +1260,14 @@
         errors = []
         executor_class = self.detect_executor(
             os.path.join(remote_workdir, test_file.name)
         )
         if not executor_class:
-            self._logger.warning(
-                'Cannot get executor for test %s',
-                test_file
-            )
+            self._logger.warning('Cannot get executor for test %s', test_file)
             return errors
         self._logger.info('Running %s', test_file)
-        self._logger.debug(
-            'Executor: %s', executor_class.__name__
-        )
+        self._logger.debug('Executor: %s', executor_class.__name__)
         if executor_class == AnsibleExecutor:
             cmd_args = [test_file]
             workdir = local_workdir
             executor_params['binary_name'] = self.ansible_playbook_binary
         else:
@@ -1310,12 +1373,14 @@
             tests_list = self.find_tests(remote_workdir)
             # Check if package has 0_init-like script
             for test_file in tests_list:
                 if tests_to_run and test_file.name not in tests_to_run:
                     continue
-                if (('0_init' not in test_file.name
-                     or '0_install' not in test_file.name)):
+                if (
+                    '0_init' not in test_file.name
+                    or '0_install' not in test_file.name
+                ):
                     self.ensure_package_is_installed(
                         package_name,
                         package_version=package_version,
                         package_epoch=package_epoch,
                     )
@@ -1408,14 +1473,18 @@
                 'Running "terraform destroy --auto-approve" command'
             )
             cmd_args = ['destroy', '--auto-approve', '-no-color']
             if self.TF_VARIABLES_FILE:
                 cmd_args.extend(['--var-file', self.TF_VARIABLES_FILE])
-            return local['terraform'].with_cwd(self._work_dir).run(
-                args=cmd_args,
-                retcode=None,
-                timeout=CONFIG.provision_timeout,
+            return (
+                local['terraform']
+                .with_cwd(self._work_dir)
+                .run(
+                    args=cmd_args,
+                    retcode=None,
+                    timeout=CONFIG.provision_timeout,
+                )
             )
 
     def erase_work_dir(self):
         if self._work_dir and os.path.exists(self._work_dir):
             self._logger.info('Erasing working directory...')
@@ -1551,11 +1620,13 @@
             artifacts_uploader=artifacts_uploader,
             package_channel=package_channel,
             verbose=verbose,
         )
         self._tests_dir = CONFIG.tests_base_dir
-        self._ssh_client: Optional[Union[AsyncSSHClient, LongRunSSHClient]] = None
+        self._ssh_client: Optional[Union[AsyncSSHClient, LongRunSSHClient]] = (
+            None
+        )
         self._vm_ip = None
 
     def _wait_for_ssh(self, retries=60):
         ansible = local[self.ansible_binary]
         cmd_args = ('-i', self.ANSIBLE_INVENTORY_FILE, '-m', 'ping', 'all')
@@ -1612,15 +1683,18 @@
     def start_env(self):
         exit_code, stdout, stderr = super().start_env()
         # VM gets its IP address only after deploy.
         # To extract it, the `vm_ip` output should be defined
         # in Terraform main file.
-        ip_exit_code, ip_stdout, ip_stderr = local['terraform'].with_cwd(
-            self._work_dir).run(
-            args=('output', '-raw',  '-no-color', 'vm_ip'),
-            retcode=None,
-            timeout=CONFIG.provision_timeout,
+        ip_exit_code, ip_stdout, ip_stderr = (
+            local['terraform']
+            .with_cwd(self._work_dir)
+            .run(
+                args=('output', '-raw', '-no-color', 'vm_ip'),
+                retcode=None,
+                timeout=CONFIG.provision_timeout,
+            )
         )
         if ip_exit_code != 0:
             error_message = f'Cannot get VM IP: {ip_stderr}'
             self._logger.error(error_message)
             return ip_exit_code, ip_stdout, ip_stderr
@@ -1660,13 +1734,13 @@
         command = ' '.join(args)
         result = self._ssh_client.sync_run_command(command)
         return result.exit_code, result.stdout, result.stderr
 
     def clone_third_party_repo(
-            self,
-            repo_url: str,
-            git_ref: str,
+        self,
+        repo_url: str,
+        git_ref: str,
     ) -> Optional[Path]:
         git_repo_path = super().clone_third_party_repo(repo_url, git_ref)
         if not git_repo_path:
             return
         if self._ssh_client:
@@ -1674,19 +1748,22 @@
                 self._tests_dir,
                 Path(repo_url).name.replace('.git', ''),
             )
             result = None
             for attempt in range(1, 6):
-                cmd = (f'if [ -e {repo_path} ]; then cd {repo_path} && '
-                       f'git reset --hard origin/master && git checkout master && git pull; '
-                       f'else cd {self._tests_dir} && git clone {repo_url}; fi')
+                cmd = (
+                    f'if [ -e {repo_path} ]; then cd {repo_path} && '
+                    f'git reset --hard origin/master && git checkout master && git pull; '
+                    f'else cd {self._tests_dir} && git clone {repo_url}; fi'
+                )
                 result = self._ssh_client.sync_run_command(cmd)
                 if result.is_successful():
                     break
                 self._logger.warning(
                     'Attempt to clone repository on VM failed:\n%s\n%s',
-                    result.stdout, result.stderr,
+                    result.stdout,
+                    result.stderr,
                 )
                 time.sleep(random.randint(5, 10))
             if not result or (result and not result.is_successful()):
                 return
 
@@ -1712,26 +1789,30 @@
         '',
         'Third party tests failed',
         exception_class=ThirdPartyTestError,
     )
     def run_third_party_test(
-            self,
-            executor: Union[AnsibleExecutor, BatsExecutor, CommandExecutor, ShellExecutor],
-            cmd_args: List[str],
-            docker_args: Optional[List[str]] = None,
-            workdir: str = '',
-            artifacts_key: str = '',
-            additional_section_name: str = '',
-            env_vars: Optional[List[str]] = None,
+        self,
+        executor: Union[
+            AnsibleExecutor, BatsExecutor, CommandExecutor, ShellExecutor
+        ],
+        cmd_args: List[str],
+        docker_args: Optional[List[str]] = None,
+        workdir: str = '',
+        artifacts_key: str = '',
+        additional_section_name: str = '',
+        env_vars: Optional[List[str]] = None,
     ):
         result = executor.run(
             cmd_args=cmd_args,
             workdir=workdir,
             env_vars=env_vars,
         )
-        if (self.VM_RESTART_OUTPUT_TRIGGER in result.stdout
-                or self.VM_RESTART_OUTPUT_TRIGGER in result.stderr):
+        if (
+            self.VM_RESTART_OUTPUT_TRIGGER in result.stdout
+            or self.VM_RESTART_OUTPUT_TRIGGER in result.stderr
+        ):
             reboot_result = self.reboot_target()
             if not reboot_result:
                 exit_code = 1
                 stderr = result.stderr + '\n\nReboot failed'
                 return exit_code, result.stdout, stderr

Isort report
--- /code/alts/worker/runners/base.py:before	2024-07-04 13:20:09.370111
+++ /code/alts/worker/runners/base.py:after	2024-07-04 13:20:56.411649
@@ -17,15 +17,15 @@
     Dict,
     List,
     Optional,
-    Union,
     Tuple,
     Type,
+    Union,
 )
 
 from billiard.exceptions import SoftTimeLimitExceeded
 from filelock import FileLock
 from mako.lookup import TemplateLookup
-from plumbum import local, ProcessExecutionError, ProcessTimedOut
+from plumbum import ProcessExecutionError, ProcessTimedOut, local
 
 from alts.shared.exceptions import (
     AbortedTestTask,

Bandit report
Run started:2024-07-04 13:20:57.237158

Test results:
>> Issue: [B108:hardcoded_tmp_directory] Probable insecure usage of temp file/directory.
   Severity: Medium   Confidence: Medium
   CWE: CWE-377 (https://cwe.mitre.org/data/definitions/377.html)
   More Info: https://bandit.readthedocs.io/en/1.7.8/plugins/b108_hardcoded_tmp_directory.html
   Location: ./alts/worker/runners/base.py:74:20
73	)
74	TF_INIT_LOCK_PATH = '/tmp/tf_init_lock'
75	BASE_SYSTEM_INFO_COMMANDS = {

--------------------------------------------------
>> Issue: [B311:blacklist] Standard pseudo-random generators are not suitable for security/cryptographic purposes.
   Severity: Low   Confidence: High
   CWE: CWE-330 (https://cwe.mitre.org/data/definitions/330.html)
   More Info: https://bandit.readthedocs.io/en/1.7.8/blacklists/blacklist_calls.html#b311-random
   Location: ./alts/worker/runners/base.py:592:27
591	                attempts -= 1
592	                time.sleep(random.randint(5, 10))
593	        if attempts == 0 and recorded_exc:

--------------------------------------------------
>> Issue: [B311:blacklist] Standard pseudo-random generators are not suitable for security/cryptographic purposes.
   Severity: Low   Confidence: High
   CWE: CWE-330 (https://cwe.mitre.org/data/definitions/330.html)
   More Info: https://bandit.readthedocs.io/en/1.7.8/blacklists/blacklist_calls.html#b311-random
   Location: ./alts/worker/runners/base.py:1048:27
1047	                self._logger.debug('Sleeping before making another attempt')
1048	                time.sleep(random.randint(5, 10))
1049	            else:

--------------------------------------------------
>> Issue: [B110:try_except_pass] Try, Except, Pass detected.
   Severity: Low   Confidence: High
   CWE: CWE-703 (https://cwe.mitre.org/data/definitions/703.html)
   More Info: https://bandit.readthedocs.io/en/1.7.8/plugins/b110_try_except_pass.html
   Location: ./alts/worker/runners/base.py:1655:12
1654	                self._ssh_client.close()
1655	            except:
1656	                pass
1657	        super().teardown(publish_artifacts=publish_artifacts)

--------------------------------------------------
>> Issue: [B311:blacklist] Standard pseudo-random generators are not suitable for security/cryptographic purposes.
   Severity: Low   Confidence: High
   CWE: CWE-330 (https://cwe.mitre.org/data/definitions/330.html)
   More Info: https://bandit.readthedocs.io/en/1.7.8/blacklists/blacklist_calls.html#b311-random
   Location: ./alts/worker/runners/base.py:1689:27
1688	                )
1689	                time.sleep(random.randint(5, 10))
1690	            if not result or (result and not result.is_successful()):

--------------------------------------------------

Code scanned:
	Total lines of code: 1589
	Total lines skipped (#nosec): 0
	Total potential issues skipped due to specifically being disabled (e.g., #nosec BXXX): 0

Run metrics:
	Total issues (by severity):
		Undefined: 0
		Low: 4
		Medium: 1
		High: 0
	Total issues (by confidence):
		Undefined: 0
		Low: 0
		Medium: 1
		High: 4
Files skipped (0):

View full reports on the Job Summary page.
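For context on the Bandit findings above, two of them (B108 hardcoded `/tmp`, B110 bare `except: pass`) have common remediations; the B311 hits use `random` only for retry jitter, which is not a cryptographic use. A hedged sketch of possible fixes — illustrative only, not changes made in this PR:

```python
import logging
import tempfile
from pathlib import Path

# B108: derive the lock path from the platform temp dir instead of
# hardcoding '/tmp' (still a predictable name; tempfile.mkstemp would
# be stronger if the path need not be fixed).
TF_INIT_LOCK_PATH = str(Path(tempfile.gettempdir()) / 'tf_init_lock')


def close_quietly(client, logger=logging.getLogger(__name__)):
    # B110: narrow the bare `except: pass` to Exception and log it,
    # so teardown failures are visible without aborting cleanup.
    try:
        client.close()
    except Exception:
        logger.warning('Failed to close SSH client', exc_info=True)
```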

@Korulag Korulag merged commit 79654d5 into master Jul 4, 2024
2 checks passed
@Korulag Korulag deleted the fix-uninstall-test-2 branch July 4, 2024 13:39
Successfully merging this pull request may close these issues.

Skip uninstall_package test for protected packages and their dependencies