Automated security assessment of Amazon Web Services accounts using CIS Benchmark and Python 3⋆

Oleksandr Volotovskyi1,†, Roman Banakh1,∗,†, Andrian Piskozub1,† and Zoreslava Brzhevska2,†

1 Lviv Polytechnic National University, 12 Stepana Bandery str., 79013 Lviv, Ukraine
2 Borys Grinchenko Kyiv Metropolitan University, 18/2 Bulvarno-Kudryavska str., 04053 Kyiv, Ukraine

Abstract
This paper focuses on the security assessment of Amazon Web Services (AWS) accounts using the Center for Internet Security (CIS) benchmarks. Considering the rapid growth of digital technologies and the increasing reliance on cloud services for business and personal use, ensuring the security of data and accounts is paramount. The study aims to analyze and assess the security posture of AWS accounts, emphasizing automating this process through Python 3 while also exploring the application of CIS benchmarks specific to the platform. A thorough examination of existing security evaluation methods and tools is conducted, including practical tests to ensure that AWS accounts comply with CIS benchmark security standards. The paper highlights the benefits of streamlining and enhancing the process to improve overall efficiency by automating the security assessment. The findings offer valuable insights for businesses and individual AWS users, providing practical recommendations to strengthen data security and ensure high confidentiality, integrity, and availability. These recommendations can be a foundation for developing and implementing effective security strategies in cloud environments.

Keywords
AWS, CIS benchmarks, cloud security, automated security assessment, compliance, account security

CPITS-II 2024: Workshop on Cybersecurity Providing in Information and Telecommunication Systems II, October 26, 2024, Kyiv, Ukraine
∗ Corresponding author.
† These authors contributed equally.
oleksandr.volotovskyi.kb.2020@lpnu.ua (O. Volotovskyi); roman.i.banakh@lpnu.ua (R. Banakh); andrian.z.piskozub@lpnu.ua (A. Piskozub); z.brzhevska@kubg.edu.ua (Z. Brzhevska)
ORCID: 0009-0003-3102-3694 (O. Volotovskyi); 0000-0001-6897-8206 (R. Banakh); 0000-0002-3582-2835 (A. Piskozub); 0000-0002-7029-9525 (Z. Brzhevska)
© 2024 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (ceur-ws.org), ISSN 1613-0073, p. 363

1. Introduction

In today's digital world, where virtual infrastructure is becoming integral to business and personal life, data and account security is critical. This is especially true for cloud platforms such as Amazon Web Services (AWS), which offer a wide range of data storage, processing, and analysis services. In this context, the issue of assessing the security of AWS accounts becomes increasingly relevant. Although tools and methods for security assessment, such as the CIS Benchmark for AWS, play a crucial role in enhancing information security, it is equally important to consider comprehensive frameworks like ISO/IEC 27001:2022 and approaches such as Security as Code [1] to address configuration management more effectively, as the lack of such comprehensive approaches could lead to significant and potentially irreversible losses.

Assessing the security of Amazon Web Services accounts using the Center for Internet Security (CIS) benchmarks [2] and automating this process [3] allows for effective monitoring and enhancement of security measures. By utilizing existing methods and tools for security evaluation, studying the CIS benchmark recommendations for AWS, conducting practical tests, and verifying account compliance with security standards, the reliability of cloud environments can be significantly improved [4].

The CIS Benchmark recommendations cover the configuration of various AWS services, such as Amazon S3 [5], Amazon EC2 [6], Amazon RDS [7], and others. These guidelines help configure access permissions, ensure effective monitoring and logging of events, and provide automated tools to verify compliance with security standards. Continuous updates in response to new threats and changes in the AWS environment ensure the relevance and effectiveness of security measures.

Assessing AWS account security with the CIS Benchmark is a powerful tool for organizations looking to protect their data and services in the cloud [8, 9]. Using such tools mitigates risks and builds trust with customers and partners, enhancing the organization's reputation in the market. Implementing the AWS CIS Benchmark is thus a strategic step for any organization that aims to ensure the highest level of security for its cloud resources.
2. Measures and tools to improve security in AWS

The AWS CIS Benchmark is a set of recommendations and guidelines for setting up security in an Amazon Web Services (AWS) environment. CIS (Center for Internet Security) is a non-profit organization specializing in developing standards and methods for ensuring information technology security.

The AWS CIS Benchmark consists of recommendations and guidelines to help organizations ensure a high level of security for their accounts, resources, and services in the AWS environment. This set includes recommendations for configuring various AWS services, setting up access rights, monitoring, logging, and other security aspects.

Key features of the AWS CIS Benchmark:

• Security standards: The recommendations define security standards for various AWS services, including Amazon S3, Amazon EC2, Amazon RDS, and others.
• Security recommendations: The CIS Benchmark provides detailed recommendations for securely configuring AWS services and resources.
• Automated testing: The recommendations can be used for automated security testing of an AWS environment to detect security breaches and non-compliance.
• Updates: CIS regularly updates its recommendations to reflect changes in the AWS environment and evolving security threats.
• Openness and community: The CIS Benchmark is an open standard, and all its recommendations are available to the community and third-party developers.
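The "automated testing" idea above can be illustrated with a minimal check harness: each control is a Python function returning True or False, and a runner aggregates the results. The control names and harness structure here are our own illustration, not part of the CIS Benchmark itself:

```python
from typing import Callable, Dict

def run_checks(checks: Dict[str, Callable[[], bool]]) -> Dict[str, str]:
    """Run each named control check and map it to PASS/FAIL.

    A check that raises is reported as an error instead of
    aborting the whole assessment run.
    """
    results = {}
    for control_id, check in checks.items():
        try:
            results[control_id] = "PASS" if check() else "FAIL"
        except Exception as exc:
            results[control_id] = f"ERROR: {exc}"
    return results

# Example with stubbed checks; real ones would call AWS APIs via boto3.
report = run_checks({
    "s3-encryption": lambda: True,
    "s3-public-access-block": lambda: False,
})
print(report)  # → {'s3-encryption': 'PASS', 's3-public-access-block': 'FAIL'}
```

Checks stay independent this way: a single failing or crashing control never hides the results of the others.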
3. Compliance achievement with AWS services

Although AWS was created as a platform for providing virtual machine services, today this provider offers hundreds of different services. Since there are many services and the account owner can add many users to an account, monitoring user activity is a natural need. Therefore, AWS pays great attention to services for securing user accounts and monitoring them. In this discussion, we pay attention to such services.

3.1. Using AWS services to improve accounts' security

There are a couple of essential services that allow owners to keep accounts safe. If you neglect them, you can lose access to the account, which in turn can lead to reputational and financial losses.

3.1.1. Using the IAM service to improve accounts' security

There are several ways to avoid providing unlimited and long-term access to AWS S3 storage.

The first way is to set access rules on the data in the storage. The number of people with access to S3 storage, even among senior management, should be limited unless there is a critical need for such access.

The second way is to apply the least-privilege rule. The Identity and Access Management (IAM) service allows you to restrict access to S3 storage with the proper settings. Thus, users and programs are granted only the minimum permissions necessary to perform their work. This approach allows you to control permissions and reduces risks.

The third way is temporary access through IAM roles. The policy may be customized by adding conditions, such as IP addresses, to define a secure process between the application and S3 storage through IAM roles. This ensures that access to data is temporary and limited.

To prevent inappropriate permissions and privileges in AWS, it is essential to proactively manage identity and access rights by configuring user permissions according to their roles and responsibilities. It is worth using an identity and access management (IAM) provider that allows you to assign permissions to each user or group of users. To increase the effectiveness of permissions management, it is necessary to regularly review all users with higher privileges and update their permissions to match their current roles and responsibilities. This will help avoid unauthorized use of permissions and ensure compliance with the principle of least privilege.

3.1.2. Using MFA and AWS Secrets Manager to improve accounts' security

To protect yourself from losing your AWS account, you should use multi-factor authentication [10] to log in to your account. This provides an additional layer of security and makes it harder for an attacker to take over your accounts, even if credentials are compromised. It is also important to constantly monitor attempts to log in to your accounts to detect possible intrusion attempts in time.

For more reliable control and security of credentials, you can use AWS Secrets Manager, which provides the ability to rotate credentials and store them in a secure environment. This limits the risk of credential theft and misuse of the infrastructure.
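The conditional, temporary-access approach described above can be sketched as an IAM-style policy document plus a small helper that flags over-broad statements. The policy content and the helper are illustrative assumptions for this sketch, not text from the benchmark:

```python
# Hypothetical bucket policy: allow reads only from a trusted IP range.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject"],
        "Resource": "arn:aws:s3:::example-bucket/*",
        "Condition": {"IpAddress": {"aws:SourceIp": "203.0.113.0/24"}},
    }],
}

def overly_broad_statements(policy_doc):
    """Return statements that grant every action on every resource."""
    flagged = []
    for stmt in policy_doc.get("Statement", []):
        actions = stmt.get("Action", [])
        resources = stmt.get("Resource", [])
        actions = actions if isinstance(actions, list) else [actions]
        resources = resources if isinstance(resources, list) else [resources]
        if "*" in actions and "*" in resources:
            flagged.append(stmt)
    return flagged

print(overly_broad_statements(policy))  # → [] (the policy above is restrictive)
```

A review script built on such a helper would pull real policy documents via boto3's IAM or S3 APIs and report only the statements that violate least privilege.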
3.2. Using AWS services for monitoring and logging

In this section, we use CloudTrail [11] to monitor and audit account activity, AWS Config [12] for automated configuration management and compliance, and AWS GuardDuty to detect potential threats to the infrastructure. Using these services allows you to maintain a high level of security and respond to possible security threats in time.

3.2.1. Using the CloudTrail service for monitoring and logging

Enabling bucket access logging can prevent undetected request events. It is important to note that S3 storage does not create logs by default, so it is essential to enable this feature. From then on, the S3 bucket will log all types of requests it receives, along with the time of each request. Using access logs speeds up the process of detecting and responding to unexpected activity.

In addition, you should use Amazon CloudTrail. This service allows you to track and log every API call to your AWS account. Logs contain essential information such as IP addresses, request execution time, and types of interactions. Monitoring logs allows for the detection of dangerous or unusual activities in time. This detection process is essential for preventing cyber threats and security breaches. CloudTrail makes it easy to receive notifications of security events, such as root logins, and receiving these notifications speeds up the response to potential risks.
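As a sketch of the log monitoring described above, the helper below scans CloudTrail-style event records for root-user console logins. The record shape is a simplified assumption based on CloudTrail's JSON event format; in practice the records would come from the trail's S3 bucket or the LookupEvents API:

```python
def find_root_logins(records):
    """Return (source IP, event time) pairs for root ConsoleLogin events."""
    hits = []
    for record in records:
        if (record.get("eventName") == "ConsoleLogin" and
                record.get("userIdentity", {}).get("type") == "Root"):
            hits.append((record.get("sourceIPAddress"), record.get("eventTime")))
    return hits

# Two sample records: one root console login, one ordinary S3 call.
sample = [
    {"eventName": "ConsoleLogin", "userIdentity": {"type": "Root"},
     "sourceIPAddress": "198.51.100.7", "eventTime": "2024-10-26T10:00:00Z"},
    {"eventName": "PutObject", "userIdentity": {"type": "IAMUser"}},
]
print(find_root_logins(sample))  # → [('198.51.100.7', '2024-10-26T10:00:00Z')]
```

Feeding such a filter from a scheduled job (or an SNS-backed notification rule) turns the raw trail into the kind of root-login alerting the section recommends.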
3.2.2. Using AWS Config for configuration management and compliance

AWS Config allows you to evaluate, verify, and control the configuration of resources in the AWS infrastructure [13]. It also allows you to perform actions such as change logging, compliance assessment, configuration tracking, and change history.

AWS Config logs every resource configuration change, including access and security policies. This allows you to respond to any changes quickly and helps identify possible security issues.

Moreover, AWS Config allows you to create rules that automatically evaluate resource configurations against defined security policies and standards. These rules can include checking data encryption, configuration settings, and more.

Configuration history shows all the changes to resources over a particular time. Thus, configuration history allows you to analyze the causes of possible configuration problems or failures.

Also, notifications through Amazon SNS [14] allow you to receive information about configuration changes and inconsistencies in real time, allowing you to respond quickly to potential problems.
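The rule-evaluation results described above can be post-processed into a compact report. The helper below assumes the dictionary shape of the `ComplianceByConfigRules` list returned by AWS Config's describe_compliance_by_config_rule API (fetched via boto3 in practice); only that post-processing step is shown here:

```python
def summarize_compliance(compliance_items):
    """Count AWS Config rules by compliance type (COMPLIANT, NON_COMPLIANT, ...)."""
    summary = {}
    for item in compliance_items:
        ctype = item.get("Compliance", {}).get("ComplianceType", "INSUFFICIENT_DATA")
        summary[ctype] = summary.get(ctype, 0) + 1
    return summary

# Rule names below are examples of AWS managed Config rules.
sample = [
    {"ConfigRuleName": "s3-bucket-ssl-requests-only",
     "Compliance": {"ComplianceType": "COMPLIANT"}},
    {"ConfigRuleName": "rds-storage-encrypted",
     "Compliance": {"ComplianceType": "NON_COMPLIANT"}},
]
print(summarize_compliance(sample))  # → {'COMPLIANT': 1, 'NON_COMPLIANT': 1}
```

Such a summary gives a one-line health indicator per account, which pairs naturally with the SNS notifications mentioned above.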
3.2.3. Using AWS GuardDuty for continuous threat monitoring

AWS GuardDuty is a service designed to analyze event logs, network [15] traffic, and other data sources hosted by AWS to detect unusual or suspicious activity. In addition, GuardDuty uses machine learning and artificial intelligence algorithms to identify potential security threats.

The system can analyze numerous activities, such as unusual external traffic, suspicious intrusion attempts, and changes in security system configuration, to identify potential threats. Once such threats are detected, GuardDuty sends alerts and event reports, allowing security operators to respond immediately to potential problems.

3.3. Using AWS services to protect traffic and resources

This section covers the use of a Web Application Firewall (WAF [16]) and Network Access Control Lists (NACLs) to filter traffic, protection against distributed denial-of-service (DDoS [17]) attacks with AWS Shield, and the role of AWS Security Hub [18] in centralized security management.
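The alerts GuardDuty emits carry a numeric severity, which makes triage easy to automate. The sketch below buckets findings using GuardDuty's documented severity bands (low below 4.0, medium below 7.0, high otherwise); the finding dictionaries are a simplified stand-in for what boto3's guardduty get_findings call would return:

```python
def triage_findings(findings):
    """Bucket GuardDuty-style findings by severity band."""
    triaged = {"low": [], "medium": [], "high": []}
    for finding in findings:
        severity = float(finding.get("Severity", 0.0))
        if severity < 4.0:
            triaged["low"].append(finding.get("Type"))
        elif severity < 7.0:
            triaged["medium"].append(finding.get("Type"))
        else:
            triaged["high"].append(finding.get("Type"))
    return triaged

sample = [
    {"Type": "Recon:EC2/PortProbeUnprotectedPort", "Severity": 2.0},
    {"Type": "UnauthorizedAccess:EC2/SSHBruteForce", "Severity": 8.0},
]
print(triage_findings(sample))
# → {'low': ['Recon:EC2/PortProbeUnprotectedPort'], 'medium': [],
#    'high': ['UnauthorizedAccess:EC2/SSHBruteForce']}
```

An operator can then route only the "high" bucket to paging while lower bands go to a daily digest.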
3.3.1. Using WAF and NACLs to filter traffic and improve security

Users should deploy a Web Application Firewall (WAF) to protect AWS from unfiltered traffic from untrusted resources. WAF effectively filters traffic, preventing attacks and prohibited access to AWS resources.

However, it is important to remember that installing a WAF alone does not guarantee complete protection. To be more effective, you should combine WAF with other security measures, such as user identification and authentication, network security measures, regular security audits, and staff training on the latest threats and security practices. You should also keep your WAF rules up to date and analyze traffic to identify new threats and attacks.

Additional security can be provided through network access control lists that manage the entry and exit of site visitors from the subnet. For example, setting up security rules in a NACL denies access to specific ports or IP addresses. Thus, by frequently checking and updating the rules, you can avoid threats and achieve a higher level of protection.

3.3.2. Using AWS Shield to protect against DDoS attacks

AWS Shield is an integral part of the infrastructure for protecting [19] against DDoS attacks. AWS Shield helps ensure the stability of applications and websites in the AWS environment. The main focus of AWS Shield is to protect against various types of DDoS attacks, including parser attacks at Layer 7 and attacks at Layers 3 and 4.

This service automatically detects attacks, responds quickly, and mitigates their impact on systems. In addition, AWS Shield integrates with other AWS security services, including AWS WAF (Web Application Firewall), to provide an advanced level of protection. Beyond the standard level of protection, there is an extended version, AWS Shield Advanced. This paid plan provides additional features, such as protection against sophisticated and large-scale attacks.

3.3.3. Using AWS Security Hub for centralized security management

AWS Security Hub is a centralized service for security control and monitoring of a customer's AWS infrastructure. It offers security incident detection, automated notification processing, and integration with other security tools.

AWS Security Hub processes data from many sources, including AWS CloudTrail, AWS Config, Amazon GuardDuty, and many others, and then provides a single view of an AWS user account's security status. Using AWS Security Hub, you can notice potential security threats, such as unusual or suspicious activity, non-compliance with security requirements, and many other vulnerabilities. When such incidents are detected, Security Hub can send alerts and provide recommendations on how to resolve them.

AWS Security Hub centralizes and automates AWS security management, enabling you to identify and respond to potential security threats quickly. This service helps ensure high security for infrastructure and data in the AWS cloud environment.
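Security Hub's "single view" can be illustrated with a small aggregation helper. The dictionaries below loosely follow the AWS Security Finding Format (ASFF) used by Security Hub; in practice these records would come from boto3's securityhub get_findings call, which is only assumed here:

```python
def summarize_by_severity(findings):
    """Group Security Hub (ASFF-style) findings by their severity label."""
    by_label = {}
    for finding in findings:
        label = finding.get("Severity", {}).get("Label", "INFORMATIONAL")
        by_label.setdefault(label, []).append(finding.get("Title"))
    return by_label

sample = [
    {"Title": "S3 bucket allows public read", "Severity": {"Label": "CRITICAL"}},
    {"Title": "CloudTrail not encrypted with KMS", "Severity": {"Label": "MEDIUM"}},
]
print(summarize_by_severity(sample))
# → {'CRITICAL': ['S3 bucket allows public read'],
#    'MEDIUM': ['CloudTrail not encrypted with KMS']}
```

Because Security Hub already normalizes findings from CloudTrail, Config, and GuardDuty into one format, a single helper like this covers all of those sources at once.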
4. Security issues in Amazon Web Services

Poor security in Amazon Web Services (AWS) is a widespread problem that exposes companies and enterprises to high risks. It mainly stems from issues that undermine the integrity, confidentiality, and availability of data and resources hosted in an AWS environment.

Incorrect configurations are often the cause of AWS security breaches. Configuration errors related to various AWS services, such as security groups or Simple Storage Service (S3) storage, can easily lead to the leakage of confidential information or unintended unauthorized access. These mistakes can result from oversight, incompetence, or failure to follow the security rules set by AWS.

Another cause is insufficient visibility into security in the AWS environment. Monitoring all assets in large infrastructures around the clock to capture such incidents is difficult. Without adequate monitoring and logging systems, hackers can go undetected until they cause damage.

5. Threats to AWS services

This section describes the security threats associated with using Amazon S3 and AWS. In particular, it discusses unlimited and long-term access to S3 buckets, which can lead to data leakage, and undetected request events to S3 buckets, which make it challenging to detect unauthorized access.

5.1. Unlimited and long-lasting access to S3 buckets

Unlimited and prolonged access to S3 buckets can create vulnerabilities. S3 (Simple Storage Service) allows you to store data with easy and secure access. The data is uploaded to several data centers in a selected region and stored with backups. S3 buckets can be vulnerable if they provide uncontrolled access to all users. Attackers can use read/write access to encrypt essential documents, change settings, or install malware. Therefore, it is crucial to manage permissions for access to buckets. Permissions can include editing, viewing, uploading/deleting, and list viewing. Reviewing permissions helps reduce AWS security risks. Using temporary access through IAM roles is recommended, by creating particular policies with conditions such as IP addresses. This allows you to ensure a secure interaction process between your application and S3 buckets.

5.2. Unprotected request events to S3 buckets

S3 buckets can be a target for data theft because they process objects and store application files. Cyberattacks that lead to data breaches consist of countless requests to access the data in these buckets. Without logs of these requests, they go undetected until it is too late.

S3 buckets do not generate logs by default, so this feature must be enabled manually. Once enabled, S3 buckets will create access logs for any request made to them, with details such as the type of request, the resource used for the request, and date and time stamps. Having access logs helps you assess AWS security risks by tracking requests and recognizing the type of requests made. An AWS security audit would be a great approach to identify such misconfigurations.
5.3. Unfiltered traffic from unreliable sources

When traffic to AWS instances or load balancers is unrestricted, attackers can obtain information about the application to attack. To avoid this, you must restrict access to instances and control traffic. DDoS attacks are possible without proper network configuration and can quickly overwhelm the system. Restricting traffic from suspicious sources reduces risks and shrinks the attack surface.

Security groups, which function as a firewall, allow only authorized traffic: they permit access only from specific IP addresses or ranges. A Network Access Control List (NACL) provides an additional layer of security for subnets. Users must ensure that the NACL does not allow access from all IP addresses or ports, and create new restrictive rules.
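The security-group review described above can be sketched as a helper that flags ingress rules open to the whole internet on administrative ports. The input mimics the `SecurityGroups` list returned by EC2's describe_security_groups call, which is an assumption for this sketch:

```python
def open_ingress_rules(security_groups, risky_ports=(22, 3389)):
    """Flag ingress rules open to 0.0.0.0/0 on risky ports (SSH, RDP)."""
    flagged = []
    for group in security_groups:
        for perm in group.get("IpPermissions", []):
            from_port = perm.get("FromPort")
            open_to_world = any(r.get("CidrIp") == "0.0.0.0/0"
                                for r in perm.get("IpRanges", []))
            if open_to_world and from_port in risky_ports:
                flagged.append((group.get("GroupId"), from_port))
    return flagged

sample = [{
    "GroupId": "sg-0123456789abcdef0",
    "IpPermissions": [
        {"FromPort": 22, "ToPort": 22, "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
        {"FromPort": 443, "ToPort": 443, "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
    ],
}]
print(open_ingress_rules(sample))  # → [('sg-0123456789abcdef0', 22)]
```

Note that HTTPS open to the world is left alone; only the administrative ports from `risky_ports` are reported, matching the restrictive-rule advice above.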
6. Automated assessment of compliance with CIS Benchmark controls

6.1. Identity and access management section

The code is implemented in Python to check and collect information about the security of accounts in AWS Identity and Access Management (IAM). It uses the boto3 [20] and pytz libraries to interact with AWS services and work with data. It checks various aspects according to the CIS Benchmark controls of the Identity and Access Management section. The results of the checks are saved in a JSON file.

First, we need to import a few important libraries that will be used in our script:

• boto3: the core AWS SDK library for Python that allows you to interact with AWS services, specifically S3.
• json: used to work with JSON data, in which we will store the results of the check.
• subprocess: allows you to execute system commands through the shell, which is necessary for some specific queries.
• xml.etree.ElementTree: a standard library for processing XML (in our case it is not used directly, but may be needed for future integrations).

The first check evaluates the encryption of data in the S3 bucket. According to the CIS Benchmark, all buckets must be encrypted using the AES-256 algorithm. To do this, we use the check_s3_bucket_encryption() function. It calls the S3 API and checks whether encryption is enabled and whether AES-256 is used.

```python
import boto3
from botocore.exceptions import ClientError

def check_s3_bucket_encryption(bucket_name):
    s3_client = boto3.client('s3')
    try:
        encryption_response = s3_client.get_bucket_encryption(Bucket=bucket_name)
        encryption_configuration = encryption_response.get(
            'ServerSideEncryptionConfiguration', {})
        sse_algorithm = encryption_configuration.get('Rules', [{}])[0]\
            .get('ApplyServerSideEncryptionByDefault', {})\
            .get('SSEAlgorithm', '')
        return sse_algorithm in ['AES256', 'aws:kms']
    except s3_client.exceptions.NoSuchBucketEncryption:
        print(f"Bucket '{bucket_name}' does not have encryption configured.")
        return False
    except ClientError as e:
        print(f"Error checking S3 bucket encryption for {bucket_name}: {e}")
        return False
```

The function makes a request to the S3 API to get the encryption configuration of the bucket. If the bucket is encrypted with AES-256 (or AWS KMS), the function returns True; otherwise, it returns False. In case of an error (for example, if encryption is not configured or the bucket does not exist), a corresponding message is displayed.

The next step is to make sure that all traffic to the S3 bucket is transmitted over a secure connection (SecureTransport) [21]. The bucket policy [22] can be retrieved either with the aws s3api get-bucket-policy CLI command (run through subprocess) or, as in the listing below, with boto3; the policy is then searched for the requirement to use HTTPS. If the policy contains a requirement to use only a secure connection, the function returns True; otherwise, it returns False.

```python
import boto3
import json
from botocore.exceptions import ClientError

def check_secure_transport(bucket_name):
    s3 = boto3.client('s3')
    try:
        response = s3.get_bucket_policy(Bucket=bucket_name)
        policy = json.loads(response['Policy'])
        for statement in policy.get("Statement", []):
            if "Condition" in statement and "Bool" in statement["Condition"]:
                if "aws:SecureTransport" in statement["Condition"]["Bool"]:
                    if statement["Condition"]["Bool"]["aws:SecureTransport"] == "true":
                        return True
        return False
    except ClientError as e:
        print(f"Error checking secure transport for bucket {bucket_name}: {e}")
        return False
```

Another important recommendation is to enable versioning of the bucket and additional protection with MFA (Multi-Factor Authentication) [23]. Versioning helps to save all changes made to files, and MFA protects against accidental or malicious deletions.

```python
import boto3
from botocore.exceptions import ClientError

def check_bucket_versioning_mfa(bucket_name):
    s3_client = boto3.client('s3')
    try:
        versioning_response = s3_client.get_bucket_versioning(Bucket=bucket_name)
        versioning_status = versioning_response.get('Status', 'Disabled')
        if versioning_status == 'Enabled':
            mfa_delete_status = versioning_response.get('MFADelete', 'Disabled')
            return mfa_delete_status == 'Enabled'
        else:
            return False
    except ClientError as e:
        print(f"Error checking S3 bucket versioning and MFADelete: {e}")
        return False
```

The function checks whether versioning is enabled for a particular bucket. If versioning is enabled, it also checks whether MFA Delete is enabled. It returns True if both features are enabled, and False otherwise.

The last check concerns the public access blocking settings. It is important to ensure that S3 buckets are not publicly accessible unless this is a conscious choice. To do this, we use the check_s3_public_access_block() function.

```python
import boto3
from botocore.exceptions import ClientError

def check_s3_public_access_block(bucket_name):
    s3_client = boto3.client('s3')
    try:
        access_block_response = s3_client.get_public_access_block(Bucket=bucket_name)
        config = access_block_response.get('PublicAccessBlockConfiguration', {})
        block_public_acls = config.get('BlockPublicAcls', False)
        ignore_public_acls = config.get('IgnorePublicAcls', False)
        block_public_policy = config.get('BlockPublicPolicy', False)
        restrict_public_buckets = config.get('RestrictPublicBuckets', False)
        return (block_public_acls and ignore_public_acls and
                block_public_policy and restrict_public_buckets)
    except ClientError as e:
        print(f"Error checking public access block for bucket {bucket_name}: {e}")
        return False
```

The function checks the Public Access Block configuration to ensure that all policies that block public access are enabled. It returns True if all these options are enabled.

6.2. Elastic Compute Cloud (EC2) section

This code snippet implements checking the default encryption settings for EBS (Elastic Block Store) [24] volumes in different AWS regions. Using the AWS API for each EC2 region, it checks whether default encryption for EBS is set in each of them. This allows you to ensure that security settings are consistent across all regions where the AWS infrastructure is used.

This function checks whether EBS encryption is enabled by default in the specified region.

```python
import boto3
from botocore.exceptions import ClientError

def check_ebs_encryption_by_default(region):
    try:
        ec2_client = boto3.client('ec2', region_name=region)
        response = ec2_client.get_ebs_encryption_by_default()
        return response.get('EbsEncryptionByDefault', False)
    except ClientError as e:
        print(f"Error checking EBS encryption by default for region {region}: {e}")
        return False
```

The next part of the script collects all AWS regions, checks the default EBS encryption status in each region using the function above, and saves the results.

```python
def write_results_to_file(results):
    with open('ebs_encryption_results.txt', 'w') as file:
        for region, is_encrypted in results.items():
            file.write(f"{region}: {'Enabled' if is_encrypted else 'Disabled'}\n")

def main():
    try:
        ec2_client = boto3.client('ec2')
        ec2_regions = [region['RegionName'] for region in
                       ec2_client.describe_regions()['Regions']]
        results = {}
        for region in ec2_regions:
            results[region] = check_ebs_encryption_by_default(region)
        write_results_to_file(results)
    except ClientError as e:
        print(f"Error: {e}")

if __name__ == "__main__":
    main()
```

6.3. Relational database service section

This code snippet implements the verification of some security [25] aspects of RDS (Relational Database Service) databases in AWS across different regions. It checks whether the data storage is encrypted, whether automatic updates of minor versions of RDS are enabled, and whether the databases are available for public access.

The following function checks whether encryption is enabled for each database across all AWS regions.

```python
import boto3
from botocore.exceptions import ClientError

def check_rds_storage_encryption():
    results = {}
    try:
        ec2_client = boto3.client('ec2')
        regions = [region['RegionName'] for region in
                   ec2_client.describe_regions()['Regions']]
        for region in regions:
            rds_client = boto3.client('rds', region_name=region)
            try:
                db_instances = rds_client.describe_db_instances()['DBInstances']
                for db_instance in db_instances:
                    db_instance_identifier = db_instance['DBInstanceIdentifier']
                    storage_encrypted = db_instance.get('StorageEncrypted', False)
                    if region not in results:
                        results[region] = {}
                    results[region][db_instance_identifier] = \
                        {"StorageEncrypted": storage_encrypted}
            except ClientError as e:
                print(f"Error describing DB instances in region {region}: {e}")
    except ClientError as e:
        print(f"Error describing regions: {e}")
    return results
```

The following function checks whether the database is publicly accessible.

```python
import boto3
from botocore.exceptions import ClientError

def check_rds_publicly_accessible():
    results = {}
    try:
        ec2_client = boto3.client('ec2')
        regions = [region['RegionName'] for region in
                   ec2_client.describe_regions()['Regions']]
        for region_name in regions:
            rds_client = boto3.client('rds', region_name=region_name)
            try:
                db_instances = rds_client.describe_db_instances()['DBInstances']
                for db_instance in db_instances:
                    db_instance_identifier = db_instance['DBInstanceIdentifier']
                    publicly_accessible = db_instance.get('PubliclyAccessible', False)
                    if region_name not in results:
                        results[region_name] = {}
                    results[region_name][db_instance_identifier] = \
                        {"PubliclyAccessible": publicly_accessible}
            except ClientError as e:
                print(f"Error describing DB instances in region {region_name}: {e}")
    except ClientError as e:
        print(f"Error describing regions: {e}")
    return results
```
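Section 6.1 states that check results are saved to a JSON file; a small helper in that spirit (our own sketch, assuming the nested result-dict shape produced by the RDS checks, `{region: {db_id: {flag: bool}}}`) could merge per-region results into a single report:

```python
import json

def results_to_json(rds_encryption, rds_public):
    """Merge per-region RDS check results into one JSON report string."""
    report = {"rds": {}}
    for region in sorted(set(rds_encryption) | set(rds_public)):
        report["rds"][region] = {
            "storage_encryption": rds_encryption.get(region, {}),
            "publicly_accessible": rds_public.get(region, {}),
        }
    return json.dumps(report, indent=2, sort_keys=True)

# Illustrative results for one region and one database instance.
enc = {"eu-central-1": {"db1": {"StorageEncrypted": True}}}
pub = {"eu-central-1": {"db1": {"PubliclyAccessible": False}}}
print(results_to_json(enc, pub))
```

Serializing to one sorted JSON document makes successive assessment runs easy to diff, so configuration drift shows up as a plain text change.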
6.4. Logging section

This code snippet implements the verification of compliance with various security requirements and settings in the CloudTrail service, which provides event logging in AWS. It checks the presence and status of various components, such as event logging, the inclusion of various types of events, the time of the last log delivery to CloudWatch, the status of the configuration logger, encryption and KMS [26] key settings, KMS key rotation, and others.

The first function retrieves the list of configured CloudTrail trails.

```python
import boto3
from botocore.exceptions import ClientError

def describe_trails():
    cloudtrail_client = boto3.client('cloudtrail')
    try:
        response = cloudtrail_client.describe_trails()
        return response.get('trailList', [])
    except ClientError as e:
        print(f"Error describing trails: {e}")
        return []

if __name__ == "__main__":
    trails = describe_trails()
    if trails:
        for trail in trails:
            print(trail)
    else:
        print("No trails found.")
```

The script then checks whether CloudTrail uses event logging on S3, which allows event auditing.

```python
import boto3
from botocore.exceptions import ClientError

def check_cloudtrail_s3_logging():
    cloudtrail_client = boto3.client('cloudtrail')
    try:
        response = cloudtrail_client.describe_trails()
        trails = response.get('trailList', [])
        for trail in trails:
            logging_s3_enabled = trail.get('S3BucketName') is not None
            if logging_s3_enabled:
                return True
        return False
    except ClientError as e:
        print(f"Error describing trails: {e}")
        return False
```

The next function checks whether event log encryption is enabled using AWS KMS.

```python
import boto3
from botocore.exceptions import ClientError

def check_cloudtrail_sse_kms():
    cloudtrail_client = boto3.client('cloudtrail')
    try:
        response = cloudtrail_client.describe_trails()
        trails = response.get('trailList', [])
        for trail in trails:
            kms_key_id = trail.get('KmsKeyId')
            if kms_key_id:
                return True
        return False
    except ClientError as e:
        print(f"Error describing trails: {e}")
        return False
```

Finally, the script checks whether automatic encryption key rotation is enabled to improve security.

```python
import boto3
from botocore.exceptions import ClientError

def get_kms_key_id():
    return 'your-kms-key-id'

def check_kms_key_rotation():
    kms_client = boto3.client('kms')
    key_id = get_kms_key_id()
    try:
        response = kms_client.get_key_rotation_status(KeyId=key_id)
        rotation_enabled = response.get('KeyRotationEnabled', False)
        return rotation_enabled
    except ClientError as e:
        print(f"Error getting key rotation status: {e}")
        return False
```
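Beyond the per-setting functions in this section, several trail properties can be validated in one pass. The helper below is a hypothetical consolidation over the trail dictionaries returned by describe_trails (field names as in the CloudTrail API; the issue wording is our own):

```python
def trail_issues(trails):
    """List CIS-style issues for CloudTrail trail descriptions."""
    issues = []
    for trail in trails:
        name = trail.get("Name", "<unnamed>")
        if not trail.get("IsMultiRegionTrail", False):
            issues.append(f"{name}: not multi-region")
        if not trail.get("KmsKeyId"):
            issues.append(f"{name}: logs not encrypted with KMS")
        if not trail.get("LogFileValidationEnabled", False):
            issues.append(f"{name}: log file validation disabled")
    return issues

sample = [{"Name": "main-trail", "IsMultiRegionTrail": True,
           "KmsKeyId": None, "LogFileValidationEnabled": True}]
print(trail_issues(sample))  # → ['main-trail: logs not encrypted with KMS']
```

Returning human-readable issue strings rather than a single boolean makes the report actionable: each line names both the trail and the exact setting to fix.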
It checks access to the network access control lists (ACLs) [27] and import boto3 security groups (SGs) [28] for the corresponding ports (22, from botocore.exceptions import ClientError 3389) from any IP address, checks access to security groups for IPv6, checks for restrictions in the default offline def check_cloudtrail_s3_logging(): security group, and checks the routing tables for routing cloudtrail_client = boto3.client('cloudtrail') try: rules for the special subnet. This feature checks whether the network ACLs in the response = cloudtrail_client.describe_trails() specified region allow unrestricted access to ports 22 (SSH) trails = response.get('trailList', []) and 3389 (RDP). for trail in trails: logging_s3_enabled = trail.get('S3BucketName') is not None if logging_s3_enabled: return True return False except ClientError as e: print(f"Error describing trails: {e}") return False 369 import boto3 from botocore.exceptions import ClientError 7. Conclusions The following conclusions and results were reached from def check_network_acl_access(region): ec2 = boto3.client('ec2', region_name=region) analyzing and assessing the security of Amazon Web try: Services (AWS) accounts using the CIS (Center for Internet response = ec2.describe_network_acls() Security) benchmark standards. for acl in response.get('NetworkAcls', []): First, we reviewed the existing methods and tools for for entry in acl.get('Entries', []): assessing AWS account security in detail. Studying tools if ('PortRange' in entry and entry.get('CidrBlock') == '0.0.0.0/0' and such as AWS Config, AWS Security Hub, and specialized entry.get('PortRange', {}).get('From') in [22, solutions from third-party vendors allowed us to form a 3389] and holistic view of the capabilities and limitations of different entry.get('RuleAction') == 'allow'): approaches to ensuring security in cloud environments. 
This function checks whether the default security group for a VPC [29] in the region has open rules.

import boto3
from botocore.exceptions import ClientError

def check_default_security_group(region, vpc_id):
    ec2 = boto3.client('ec2', region_name=region)
    try:
        response = ec2.describe_security_groups(
            Filters=[
                {'Name': 'vpc-id', 'Values': [vpc_id]},
                {'Name': 'group-name', 'Values': ['default']}
            ]
        )
        for group in response.get('SecurityGroups', []):
            if group.get('IpPermissions'):
                return False
        return True
    except ClientError as e:
        print(f"Error describing security groups in region {region}: {e}")
        return False
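The compliance rule itself can be separated from the API call, which also makes it easy to cover outbound rules: the CIS recommendation expects the default security group to restrict all traffic, inbound and outbound. Below is a sketch of such a predicate over the SecurityGroups list; the helper name and sample data are illustrative, not part of the original script:

```python
def default_groups_are_empty(security_groups):
    """CIS expects the default security group to carry no inbound or outbound rules."""
    for group in security_groups:
        if group.get("IpPermissions") or group.get("IpPermissionsEgress"):
            return False
    return True

# Illustrative responses in the shape of describe_security_groups()['SecurityGroups']
noncompliant = [{"GroupName": "default",
                 "IpPermissions": [{"FromPort": 22, "ToPort": 22}],  # leftover SSH rule
                 "IpPermissionsEgress": []}]
compliant = [{"GroupName": "default",
              "IpPermissions": [],
              "IpPermissionsEgress": []}]
print(default_groups_are_empty(noncompliant))  # False
print(default_groups_are_empty(compliant))     # True
```

With this split, check_default_security_group only needs to fetch the groups and delegate the decision to the predicate, which can be unit-tested without AWS credentials.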
This function verifies that the VPC routing tables are properly configured, including routes through peering connections.

import boto3
from botocore.exceptions import ClientError

def check_route_tables(region, vpc_id, peering_connection_id, desired_cidr_block):
    ec2 = boto3.client('ec2', region_name=region)
    try:
        response = ec2.describe_route_tables(
            Filters=[{'Name': 'vpc-id', 'Values': [vpc_id]}])
        for route_table in response.get('RouteTables', []):
            for route in route_table.get('Routes', []):
                # Routes through a peering connection are reported under
                # VpcPeeringConnectionId, not GatewayId
                if (route.get('VpcPeeringConnectionId') == peering_connection_id and
                        route.get('DestinationCidrBlock') == desired_cidr_block):
                    return False
        return True
    except ClientError as e:
        print(f"Error describing route tables in region {region}: {e}")
        return False
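The route inspection reduces to a predicate over the Routes list of each route table. A small sketch with an illustrative route set makes the matching rule explicit; the pcx-12345678 peering connection ID and the CIDR blocks are hypothetical:

```python
def has_peering_route(routes, peering_id, cidr_block):
    """True if a route sends the given CIDR block through the given peering connection."""
    return any(
        route.get("VpcPeeringConnectionId") == peering_id
        and route.get("DestinationCidrBlock") == cidr_block
        for route in routes
    )

# Illustrative routes in the shape of a RouteTable's 'Routes' list
routes = [
    {"DestinationCidrBlock": "10.0.0.0/16", "GatewayId": "local"},
    {"DestinationCidrBlock": "172.31.0.0/16", "VpcPeeringConnectionId": "pcx-12345678"},
]
print(has_peering_route(routes, "pcx-12345678", "172.31.0.0/16"))  # True
print(has_peering_route(routes, "pcx-12345678", "0.0.0.0/0"))      # False
```

In the spirit of the CIS "least access" routing recommendation, a flagged match indicates a peering route that is broader than the intended subnet and should be narrowed.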
7. Conclusions

The following conclusions and results were reached from analyzing and assessing the security of Amazon Web Services (AWS) accounts using the CIS (Center for Internet Security) benchmark standards.

First, we reviewed the existing methods and tools for assessing AWS account security in detail. Studying tools such as AWS Config, AWS Security Hub, and specialized solutions from third-party vendors allowed us to form a holistic view of the capabilities and limitations of different approaches to ensuring security in cloud environments.

Secondly, an in-depth study of the CIS Benchmark recommendations for AWS revealed critical security settings for various AWS services, including Amazon S3, Amazon EC2, Amazon RDS, and others. The recommendations cover a wide range of settings, such as access control, event monitoring, and logging of user actions, allowing for comprehensive cloud resource security.

Practical tests and an assessment of AWS accounts' compliance with CIS Benchmark security standards have confirmed the effectiveness of implementing these recommendations. In particular, automating the security assessment process using tools that integrate with AWS has significantly increased the efficiency and speed of identifying and fixing potential vulnerabilities.

References

[1] O. Vakhula, et al., Security as Code Concept for Fulfilling ISO/IEC 27001:2022 Requirements, in: Cybersecurity Providing in Information and Telecommunication Systems, vol. 3654 (2024) 59–72.
[2] CIS AWS Benchmark v1.5.0. URL: https://www.scribd.com/document/624550364/CIS-Amazon-Web-Services-Foundations-Benchmark-v1-5-0
[3] Automated Approach to Evaluation and Security of AWS Services using Python and "CIS Benchmark", in: 2nd International Scientific Conference (2024) 141–142.
[4] V. Shapoval, et al., Automation of Data Management Processes in Cloud Storage, in: Workshop on Cybersecurity Providing in Information and Telecommunication Systems, CPITS, vol. 3654 (2024) 410–418.
[5] Amazon S3. URL: https://aws.amazon.com/s3/
[6] Amazon EC2. URL: https://aws.amazon.com/ec2/
[7] Amazon Relational Database Service. URL: https://aws.amazon.com/rds/
[8] Practical Aspects of Using Fully Homomorphic Encryption Systems to Protect Cloud Computing; P. Anakhov, et al., Evaluation Method of the Physical Compatibility of Equipment in a Hybrid Information Transmission Network, Journal of Theoretical and Applied Information Technology 100(22) (2022) 6635–6644.
[9] V. Zhebka, et al., Optimization of Machine Learning Method to Improve the Management Efficiency of Heterogeneous Telecommunication Network, in: Workshop on Cybersecurity Providing in Information and Telecommunication Systems, vol. 3288 (2022) 149–155.
[10] D. Shevchuk, et al., Designing Secured Services for Authentication, Authorization, and Accounting of Users, in: Cybersecurity Providing in Information and Telecommunication Systems II, vol. 3550 (2023) 217–225.
[11] CloudTrail. URL: https://aws.amazon.com/cloudtrail/
[12] AWS Config. URL: https://aws.amazon.com/config/
[13] V. Khoma, et al., Comprehensive Approach for Developing an Enterprise Cloud Infrastructure, in: Cybersecurity Providing in Information and Telecommunication Systems, vol. 3654 (2024) 201–215.
[14] Amazon Simple Notification Service. URL: https://aws.amazon.com/sns/
[15] R. Banakh, A. Piskozub, Y. Stefinko, External Elements of Honeypot for Wireless Network, in: 13th International Conference on Modern Problems of Radio Engineering, Telecommunications and Computer Science (TCSET) (2016) 480–482. doi: 10.1109/TCSET.2016.7452093.
[16] AWS WAF. URL: https://aws.amazon.com/waf/
[17] DDoS Attack. URL: https://aws.amazon.com/shield/ddos-attack-protection/
[18] AWS Security Hub. URL: https://aws.amazon.com/security-hub/
[19] P. Anakhov, et al., Protecting Objects of Critical Information Infrastructure from Wartime Cyber Attacks by Decentralizing the Telecommunications Network, in: Cybersecurity Providing in Information and Telecommunication Systems, vol. 3550 (2023) 240–245.
[20] Python Library Boto3. URL: https://aws.amazon.com/sdk-for-python/
[21] SecureTransport End-User API v1.4 Documentation. URL: https://www.postman.com/api-evangelist/axway/documentation/x04b0lo/securetransport-end-user-api-v1-4
[22] O. Deineka, et al., Designing Data Classification and Secure Store Policy According to SOC 2 Type II, in: Cybersecurity Providing in Information and Telecommunication Systems, vol. 3654 (2024) 398–409.
[23] Multi-Factor Authentication (MFA). URL: https://aws.amazon.com/iam/features/mfa/
[24] Amazon EBS Documentation. URL: https://docs.aws.amazon.com/ebs/
[25] Y. Martseniuk, et al., Automated Conformity Verification Concept for Cloud Security, in: Cybersecurity Providing in Information and Telecommunication Systems, vol. 3654 (2024) 25–37.
[26] Getting Started with AWS Key Management Service. URL: https://aws.amazon.com/kms/getting-started/
[27] Access Control List (ACL) Overview. URL: https://docs.aws.amazon.com/AmazonS3/latest/userguide/acl-overview.html
[28] Find Security Group (SG) IDs, AMS. URL: https://docs.aws.amazon.com/managedservices/latest/userguide/find-SGs.html
[29] Amazon Virtual Private Cloud (VPC). URL: https://docs.aws.amazon.com/toolkit-for-visual-studio/latest/user-guide/vpc-tkv.html