Python/AWS: Lambda function to retrieve CloudTrail logs from an S3 bucket and send them to Elasticsearch

I'm using a Lambda function to retrieve CloudTrail logs from an S3 bucket and send them to Elasticsearch. I want the security group, EC2, and user logs filtered, and I need Python code to accomplish that. It should be added to the Lambda function below:
import boto3
import re
import requests
from requests_aws4auth import AWS4Auth

region = 'us-east-2'
service = 'es'
creds = boto3.Session().get_credentials()
awsauth = AWS4Auth(creds.access_key, creds.secret_key, region, service, session_token=creds.token)

host = 'https://search-elastic-elasti-rc10pj9rhd5a-3sdwcyvbeibaowoyeely7hfk7e.us-east-2.es.amazonaws.com'
index = 'lambda-s3-file-index'
doc_type = 'lambda-type'
url = host + '/' + index + '/' + doc_type

headers = { "Content-Type": "application/json" }

s3 = boto3.client('s3')

# Regular expressions for Apache-style access-log lines
pattern_ip = re.compile(r'(\d+\.\d+\.\d+\.\d+)')
pattern_time = re.compile(r'\[(\d+/\w\w\w/\d\d\d\d:\d\d:\d\d:\d\d\s-\d\d\d\d)\]')
pattern_msg = re.compile(r'"(.+)"')

def handler(event, context):
    for record in event['Records']:
        # Get the bucket name and key for the new file
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']

        # Get, read, and split the file into lines
        obj = s3.get_object(Bucket=bucket, Key=key)
        body = obj['Body'].read()
        lines = body.splitlines()

        # Match the regular expressions to each line and index the JSON
        for line in lines:
            line = line.decode("utf-8")
            ip = pattern_ip.search(line).group(1)
            timestamp = pattern_time.search(line).group(1)
            message = pattern_msg.search(line).group(1)

            document = { "ip": ip, "timestamp": timestamp, "message": message }
            r = requests.post(url, auth=awsauth, json=document, headers=headers)
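Not a complete answer, but a minimal sketch of the requested filtering, assuming CloudTrail's standard log format: each delivered object is a gzip-compressed JSON document with a top-level "Records" array, where every record carries an "eventSource" and an "eventName". The `filter_records` helper and the event-name substrings ("SecurityGroup", "Instance", "User") are illustrative assumptions, not a definitive rule set; real deployments would pin down the exact event names they care about.

```python
import json

def filter_records(cloudtrail_doc):
    """Yield only CloudTrail records about security groups, EC2
    instances, or IAM users (substring matching is an assumption).
    """
    for rec in cloudtrail_doc.get("Records", []):
        source = rec.get("eventSource", "")
        name = rec.get("eventName", "")
        if source == "ec2.amazonaws.com" and ("SecurityGroup" in name or "Instance" in name):
            yield rec  # e.g. AuthorizeSecurityGroupIngress, RunInstances
        elif source == "iam.amazonaws.com" and "User" in name:
            yield rec  # e.g. CreateUser, DeleteUser


# Sample document in the shape CloudTrail uses (fields abbreviated):
doc = {"Records": [
    {"eventSource": "ec2.amazonaws.com", "eventName": "AuthorizeSecurityGroupIngress"},
    {"eventSource": "ec2.amazonaws.com", "eventName": "RunInstances"},
    {"eventSource": "iam.amazonaws.com", "eventName": "CreateUser"},
    {"eventSource": "s3.amazonaws.com", "eventName": "PutObject"},  # filtered out
]}
kept = list(filter_records(doc))
# kept contains the first three records; the S3 event is dropped
```

Inside the handler above, this would replace the line/regex parsing, since CloudTrail objects are gzipped JSON rather than text lines: decode with `gzip.decompress(obj['Body'].read())`, parse with `json.loads`, then post each record returned by `filter_records` to the Elasticsearch URL.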