Posts

Splunk custom input with session

Splunk can be extended with custom inputs written in Python. To connect to Splunk services, the input has to be configured to receive a session token and pass it along when connecting. Here is a basic setup emitting records with KV store collection names:

#!/usr/bin/env python
import splunklib.client as client
import sys
import datetime as dt

def generate(session_key):
    service = client.connect(token=session_key)
    for collection in service.kvstore:
        ts = dt.datetime.now(tz=dt.timezone.utc).isoformat()
        print(f'{ts}, collection="{collection.name}"')

if __name__ == "__main__":
    # Splunk passes the session key on stdin when passAuth is set for the scripted input
    session_key = sys.stdin.readline().strip()
    generate(session_key)
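For the session key to arrive on stdin, the scripted input has to request it in inputs.conf. A minimal sketch, assuming the script lives in a hypothetical app's bin directory and runs every five minutes:

[script://./bin/kvstore_input.py]
passAuth = splunk-system-user
interval = 300
sourcetype = kvstore_collections
disabled = 0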

AWS Account Cleanup

Many AWS customers skip the initial planning steps and don't establish infrastructure-as-code practices up front, which later necessitates querying and cleaning up resources to avoid billing charges.

AWS tooling:
- AWS Cost Explorer
- AWS Billing Console
- AWS Tag Editor (select all regions and all resource types, and leave tags unchecked)

The open source community has developed a number of tools.

List all AWS resources:
- aws_list_all (command line)
- awsls
- AWSRetriever (desktop)
- aws-inventory (command line and GUI)

Filtered delete:
- aws-nuke (extensive filtering)
- cloud-nuke (filtering by region, age and resource type is available)
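The inventory that Tag Editor surfaces can also be pulled programmatically through the Resource Groups Tagging API. A minimal sketch with boto3, one region per call (the region name is just an example, and only resource types the tagging API knows about are returned):

import boto3

def list_tagged_resources(region_name):
    # yield ARNs of resources visible to the Resource Groups Tagging API in one region
    client = boto3.client('resourcegroupstaggingapi', region_name=region_name)
    paginator = client.get_paginator('get_resources')
    for page in paginator.paginate():
        for resource in page['ResourceTagMappingList']:
            yield resource['ResourceARN']

if __name__ == "__main__":
    for arn in list_tagged_resources('us-east-1'):
        print(arn)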

AWS Lambda Lookup AccountId and Region

For a number of uses, including generating IAM roles and various policies, Lambda code might require access to the current account ID and region. Here is a code snippet showing how to acquire those values:

import json
import logging
import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    logger.info(json.dumps(event))
    account_id = boto3.client('sts').get_caller_identity().get('Account')
    # use a client which is region based
    logs_client = boto3.client('logs')
    region_name = logs_client.meta.region_name
    logger.info(f"{account_id}:{region_name}")

AWS EC2 Instance Connect

EC2 Instance Connect is a somewhat overlooked feature that improves the security of EC2 logins. During instance configuration, the default instance user is assigned a public key so that its private counterpart can be used to connect to the instance. That private key tends to be shared within support teams, so logins can no longer be attributed to an individual. EC2 Instance Connect instead runs and logs connect commands as the individual user and obeys that user's permissions.
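A minimal sketch of pushing a key with boto3; the instance ID, availability zone, OS user and key file are placeholders. The pushed public key is only accepted for about 60 seconds, after which a regular ssh login as that OS user works and the connection is attributed to the calling IAM identity:

import boto3

if __name__ == "__main__":
    # placeholder values -- substitute your own instance, AZ, OS user and key pair
    instance_id = "i-0123456789abcdef0"
    availability_zone = "us-east-1a"
    os_user = "ec2-user"
    with open("my_key.pub") as f:
        public_key = f.read()

    eic = boto3.client('ec2-instance-connect')
    # pushes the public key to the instance for a short window, logged against the caller
    response = eic.send_ssh_public_key(
        InstanceId=instance_id,
        InstanceOSUser=os_user,
        SSHPublicKey=public_key,
        AvailabilityZone=availability_zone,
    )
    print(response)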

AWS S3 Signed Upload URL Basics

While I found a number of examples for generating signed S3 upload URLs, there didn't seem to be an example covering just the basics. After substituting your bucket name, file name and desired expiry, run the code below to generate the URL:

import boto3

if __name__ == "__main__":
    s3_client = boto3.client('s3')
    response = s3_client.generate_presigned_url(
        ClientMethod='put_object',
        Params={"Bucket": "mybucket", "Key": "file.pdf"},
        ExpiresIn=48*60*60,
        HttpMethod="PUT")
    print(response)

To upload from the command line, run the command below (substituting the URL from the previous section):
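For example, with curl (the -T option uploads the file with a PUT), assuming file.pdf is the local file matching the Key used when signing:

curl -T file.pdf "<presigned URL printed above>"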

AWS Cloudformation Referencing AMIs Using SSM Parameter Store

When AWS infrastructure is configured in the "traditional" compute/storage/network style, identifying, referencing and patching AMIs in all regions in use is crucial. Cloudformation has a way to redirect AMI references through SSM Parameter Store. This represents a tradeoff, as recreating the Cloudformation stack might pick up the next (patched) AMI, so the stack is no longer immutable. But the resulting state is similar to externally patched Linux/Windows images, which also cannot be recreated by simply redeploying Cloudformation.
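A minimal sketch of the pattern, using the public Amazon Linux 2 parameter path as the default; the instance properties are illustrative, and the parameter is re-resolved on stack create/update, which is where the newer AMI can come in:

AWSTemplateFormatVersion: "2010-09-09"
Parameters:
  LatestAmiId:
    # resolved from SSM Parameter Store when the stack is created or updated
    Type: AWS::SSM::Parameter::Value<AWS::EC2::Image::Id>
    Default: /aws/service/ami-amazon-linux-latest/amzn2-ami-hvm-x86_64-gp2
Resources:
  Instance:
    Type: AWS::EC2::Instance
    Properties:
      ImageId: !Ref LatestAmiId
      InstanceType: t3.micro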

AWS Organizations CLI

Many AWS customers take advantage of AWS Organizations to organize and secure their workloads. In many cases, users log in to their master account and configure permissions that allow them to switch to member accounts in the Console. The same permissions can be used with the AWS CLI. In the example below, a single AWS Access Key has to be generated (and rotated) in the master account, and it can be used to switch to the test/2222222 and production/3333333 accounts using the CrossAccountAccessRole already configured for switching in the Console.
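A sketch of the corresponding ~/.aws/credentials and ~/.aws/config, assuming the master account access key sits in the default profile and reusing the placeholder account IDs and role name from above:

# ~/.aws/credentials
[default]
aws_access_key_id = AKIA...
aws_secret_access_key = ...

# ~/.aws/config
[profile test]
role_arn = arn:aws:iam::2222222:role/CrossAccountAccessRole
source_profile = default

[profile production]
role_arn = arn:aws:iam::3333333:role/CrossAccountAccessRole
source_profile = default

With that in place, commands can target a member account with, for example, aws s3 ls --profile production.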

Login into a CA SSO/Siteminder protected site with Python Requests

While the code below is simple, it uses two important approaches:

- it utilizes a Requests Session to keep the Siteminder login cookies/headers around
- it loads the site in two steps, allowing the Siteminder login form to be filled out

import requests

if __name__ == "__main__":
    mysite = 'http://mysite/'
    credentials = {'USER': 'myuser', 'PASSWORD': 'mypassword'}
    s = requests.session()  # use Session to keep cookies around
    page = s.get(mysite)
    s.post(page.url, data=credentials)  # page.url is the Siteminder login screen
    page = s.get(mysite)  # re-request the site; the session cookie should now grant access to the protected page

AWS SimpleDB Boto3 Example

When looking into AWS SimpleDB, a quick search didn't return any short Python Boto3 examples, so I decided to post one. (As with any service you subscribe to, running the code below might cost you money …)

from __future__ import print_function
import boto3

def quote(string):
    # escape quote characters for use in SimpleDB select expressions
    return string.replace("'", "''").replace('"', '""').replace('`', '``')

def put_attributes(sdb, domain, id, color):
    response = sdb.put_attributes(
        DomainName=domain,
        ItemName=id,
        Attributes=[
            {
                'Name': 'color',
                'Value': color,
                'Replace': True
            },
        ],
    )
    print(response)

if __name__ == "__main__":
    domain = "TEST_DOMAIN"
    sdb = boto3.client('sdb')
    # the lines below assume the test domain can be (re)created and written to
    sdb.create_domain(DomainName=domain)
    put_attributes(sdb, domain, "item1", "red")