
How to Export AWS CloudWatch Logs to OpenSearch (Elasticsearch): Step-by-Step Tutorial


When managing large volumes of AWS CloudWatch logs, the built-in tools may fall short in terms of searchability and cost-effectiveness. A more efficient approach is to export these logs to OpenSearch (formerly Elasticsearch), which offers better performance for log management and analysis. This blog will walk you through the entire process.

Step 1: Create an OpenSearch Cluster

Navigate to the AWS Console: Open the OpenSearch Service. Create a Domain:

  1. Click on “Create Domain.”
  2. Choose “Standard Create” instead of “Easy Create” to avoid provisioning unnecessary nodes.

Select Domain Configuration:

  1. For a demo, select “Dev/Test.”
  2. Choose a general-purpose instance type, such as the M5 family, for balanced performance.
  3. Configure storage: gp3, with a size such as 10 GB.
  4. Select “Public Access” temporarily for easier setup (but always use VPC access in production).

Set Security Settings:

  1. Set up an IAM master user for OpenSearch.
  2. Assign a tag to the domain to make it easier to track.

Create the Domain: Wait for the domain to become active (this can take up to 30 minutes).

Step 2: Configure CloudWatch Log Group

Navigate to CloudWatch Logs:

  1. Select the log group you want to export.

Create a Subscription Filter:

  1. Choose “OpenSearch” as the destination.
  2. Link the subscription filter to the OpenSearch domain created earlier.

Step 3: Create a Lambda Role

Navigate to the IAM Console:

  1. Create a new role for Lambda.
  2. Choose AWS Lambda as the trusted service.

Attach VPC and OpenSearch Permissions:

  1. Attach the AWSLambdaVPCAccessExecutionRole managed policy.

Add an Inline Policy:

  1. Create an inline policy allowing OpenSearch access (index permissions, bulk data access, etc.).

The following policy is an example; in practice, narrow the broad es:* wildcard down to only the actions your function needs.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "OpensearchAccess",
      "Effect": "Allow",
      "Action": "es:*",
      "Resource": "arn:aws:es:us-east-1:<account-id>:domain/<cluster-name>"
    }
  ]
}
```
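For instance, a Lambda that only writes bulk data needs little more than HTTP write access to the domain. A narrowed sketch — the action names are the standard es:ESHttp* set, and the resource ARN placeholders follow the example above:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "OpensearchWriteOnly",
      "Effect": "Allow",
      "Action": [
        "es:ESHttpPost",
        "es:ESHttpPut"
      ],
      "Resource": "arn:aws:es:us-east-1:<account-id>:domain/<cluster-name>/*"
    }
  ]
}
```

Note the trailing `/*` on the resource: the es:ESHttp* actions apply to paths within the domain, not just the domain itself.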

Save the Role: Ensure the role is configured properly with access to both OpenSearch and CloudWatch logs.

Step 4: Edit the Lambda Function

Edit the Lambda Code:

  1. Update the code to export CloudWatch logs to specific OpenSearch indices.
  2. Modify the default index name and make sure the function handles the log data correctly.


Modify Lambda for Multiple Log Groups:

  1. If you have multiple log groups, modify the Lambda function code to create separate indices for each log group.
  2. Update the code to check the log group name and route it to a specific index in OpenSearch.

Deploy the Updates: Ensure the Lambda function properly handles log exports from every subscribed log group.

In the transform(payload) function (around line 52 of the default Lambda code), add additional log groups or change the index names, as shown:

```javascript
function transform(payload) {
    if (payload.messageType === 'CONTROL_MESSAGE') {
        return null;
    }

    var bulkRequestBody = '';

    payload.logEvents.forEach(function(logEvent) {
        var timestamp = new Date(1 * logEvent.timestamp);

        // Initialize indexName variable
        var indexName;

        if (payload.logGroup === '<log-group-1>') {
            // index name format: cw-log-group-1-YYYY.MM.DD
            indexName = [
                'cw-log-group-1-' + timestamp.getUTCFullYear(),  // year
                ('0' + (timestamp.getUTCMonth() + 1)).slice(-2), // month
                ('0' + timestamp.getUTCDate()).slice(-2)         // day
            ].join('.');
        } else {
            // index name format: cw-all-other-logs-YYYY.MM.DD
            indexName = [
                'cw-all-other-logs-' + timestamp.getUTCFullYear(), // year
                ('0' + (timestamp.getUTCMonth() + 1)).slice(-2),   // month
                ('0' + timestamp.getUTCDate()).slice(-2)           // day
            ].join('.');
        }

        // var source = buildSource(logEvent.message, logEvent.extractedFields);
        var source = {};
        source['@id'] = logEvent.id;
        source['@timestamp'] = new Date(1 * logEvent.timestamp).toISOString();
        source['@message'] = logEvent.message;
        source['@owner'] = payload.owner;
        source['@log_group'] = payload.logGroup;
        source['@log_stream'] = payload.logStream;

        var action = { "index": {} };
        action.index._index = indexName;
        action.index._id = logEvent.id;

        bulkRequestBody += [
            JSON.stringify(action),
            JSON.stringify(source),
        ].join('\n') + '\n';
    });
    return bulkRequestBody;
}
```
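The index-naming logic above can be exercised in isolation before deploying. A small self-contained sketch — the helper name and the sample log group/timestamp are made up for illustration, but the date-suffix logic mirrors the transform function:

```javascript
// Standalone copy of the date-suffix logic used in transform(), so the
// naming scheme can be checked without a live Lambda.
function indexNameFor(logGroup, timestampMs) {
    var timestamp = new Date(timestampMs);
    var prefix = (logGroup === '<log-group-1>')
        ? 'cw-log-group-1-'
        : 'cw-all-other-logs-';
    return [
        prefix + timestamp.getUTCFullYear(),             // year
        ('0' + (timestamp.getUTCMonth() + 1)).slice(-2), // zero-padded month
        ('0' + timestamp.getUTCDate()).slice(-2)         // zero-padded day
    ].join('.');
}

// 1704412800000 ms = 2024-01-05T00:00:00Z
console.log(indexNameFor('<log-group-1>', 1704412800000));
// cw-log-group-1-2024.01.05
console.log(indexNameFor('/aws/lambda/other', 1704412800000));
// cw-all-other-logs-2024.01.05
```

Daily indices like these make retention easy: old days can be dropped with a single delete-index call or an Index State Management policy.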

Deploy the Function: Once the code is set up, deploy it and link it to your log group subscription.

Step 5: Debugging and Monitoring

Monitor Logs:

  1. In the CloudWatch Logs section, monitor the log groups for any issues.
  2. Open the OpenSearch dashboard and check if the logs are properly flowing into the indices.

Handle Common Errors:

  1. If no logs appear, check the Lambda function for errors. Around line 12 of the function code, you can enable logging of failed responses by changing:

var logFailedResponses = false;

to

var logFailedResponses = true;

  2. Ensure that the Lambda role has the correct permissions and that the OpenSearch cluster is properly configured.

Steps for Dashboard

Step 1: Set Up OpenSearch Users and Roles

Create OpenSearch Users:

  1. In the OpenSearch dashboard, navigate to the Security section.
  2. Create a user for accessing logs.

Assign Roles:

  1. Create roles with index-level permissions (e.g., read-only, bulk, and monitoring).
  2. Map the roles to the created users.

Test User Access:

  1. Log in as the user and ensure they can only access their assigned indices.
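The same role and mapping can also be created through the security plugin's REST API, shown here as Dev Tools requests; the role name, user name, and index pattern below are placeholders you would adapt:

```
PUT _plugins/_security/api/roles/cw-logs-readonly
{
  "index_permissions": [
    {
      "index_patterns": ["cw-log-group-1-*"],
      "allowed_actions": ["read"]
    }
  ]
}

PUT _plugins/_security/api/rolesmapping/cw-logs-readonly
{
  "users": ["log-viewer"]
}
```

The wildcard index pattern matches the daily indices created by the Lambda function, so the user's access automatically covers new days as they arrive.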

Step 2: Test and Monitor

Run Test Queries:

  1. Use OpenSearch Dev Tools to run queries against the imported log data.
  2. Verify that the logs are properly indexed and searchable.
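For example, a quick Dev Tools query to confirm events are arriving in the daily indices — the index pattern assumes the naming scheme from the Lambda function, and sorting on @timestamp assumes it was mapped as a date:

```
GET cw-log-group-1-*/_search
{
  "size": 5,
  "sort": [{ "@timestamp": { "order": "desc" } }],
  "query": { "match_all": {} }
}
```

If this returns hits with your @message and @log_group fields populated, the export pipeline is working end to end.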

Monitor Cluster Health:

  1. Check the OpenSearch cluster status and ensure it remains healthy.
  2. Add more nodes if necessary, especially in production environments, to ensure high availability and better performance.

Conclusion

Exporting CloudWatch logs to OpenSearch offers improved search capabilities and better scalability. By following these steps, you can set up an efficient logging infrastructure that allows you to analyze large volumes of log data with ease.

Please reach out to us for any of your cloud requirements.

Ready to take your cloud infrastructure to the next level? Please Contact Us