When maintaining multiple servers, it's important to keep track of installed packages and update them as needed. Here is a simple script that reads the list of vulnerable packages and pushes it into a Google Sheet. If you have 5 servers, it will create one sheet per server inside the spreadsheet, i.e. 5 sheets. We will be using Python with the pygsheets library.
Create a new Google Sheet and have its spreadsheet ID ready; let's call it MY_SHEET_ID.
Create your Google service account credentials. Refer to https://pygsheets.readthedocs.io/en/stable/authorization.html
Download the credentials JSON file, say google-service-credentials.json.
Set up this code in a cron job and change the identifier to your server's name.
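As an example, assuming the script is saved at /opt/security-report.py (a hypothetical path, adjust for your setup), a daily crontab entry might look like:

```shell
# Run the security report every day at 06:00; the script path and
# log location are assumptions, change them for your environment.
0 6 * * * /usr/bin/python3 /opt/security-report.py >> /var/log/security-report.log 2>&1
```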
import os

import pygsheets

# Authorize with the service account credentials downloaded earlier.
gc = pygsheets.authorize(service_file="google-service-credentials.json")
sh = gc.open_by_key("MY_SHEET_ID")

# Dump the security advisories to a temporary file.
os.system("yum list-security --security > /tmp/security_output.txt")

with open("/tmp/security_output.txt", "r") as warnings:
    warn_data = warnings.readlines()

identifier = "MyServerName"  # change this on each server
fieldnames = ["name", "priority", "package"]
values_list = [fieldnames]

for line in warn_data:
    # Collapse runs of whitespace, then split the line into columns.
    data = " ".join(line.split()).split(" ")
    if len(data) == 3:
        name, priority, package = data
        values_list.append([name, priority, package])

# Reuse the server's worksheet if it exists, otherwise create it.
try:
    wks = sh.worksheet_by_title(identifier)
except pygsheets.WorksheetNotFound:
    wks = sh.add_worksheet(identifier)

wks.insert_rows(row=0, number=len(values_list), values=values_list)
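As a sanity check, here is what the whitespace-normalization step does to a typical three-column advisory line (the line below is a made-up example, not real yum output):

```python
# A hypothetical advisory line: advisory id, priority, package.
line = "RHSA-2024:0001  Important/Sec.  openssl-1.1.1k-7.el8.x86_64\n"

# Collapse repeated whitespace, then split into columns.
data = " ".join(line.split()).split(" ")
print(data)  # → ['RHSA-2024:0001', 'Important/Sec.', 'openssl-1.1.1k-7.el8.x86_64']
```

Lines that do not split into exactly three columns (headers, wrapped lines) are skipped by the `len(data) == 3` check.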
This code will create a new sheet named after the identifier if one does not already exist, and refresh the list of vulnerable packages. You can drop this script on all 5 servers and change the identifier on each.
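If you would rather not edit the script on each server, one option (an addition of mine, not part of the original script) is to derive the identifier from the machine's hostname:

```python
import socket

# Use the hostname as the worksheet name so the same script can be
# deployed unchanged to every server.
identifier = socket.gethostname()
print(identifier)
```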
From here, you can use the power of Google Sheets to group packages by severity. I hope this small script helps you in some way.
Are you running into a high AWS bill? Check out my product, explained here https://kambanthemaker.com/posts/aws-cost-visualization-and-savings and a basic introduction here by Peter.