Certified: January 21, 2026
Solution Summary
The Cloud Administration Event Log API is a REST-based web services interface that allows administration audit log events to be retrieved from the Cloud Access Service (CAS). You can use this REST API to download administration logs to a Cloud Administration Event Log API client.
Configuration Summary
This guide provides instructions on how to integrate Microsoft Sentinel with RSA ID Plus.
This guide is not intended to suggest optimum installations or configurations. It is assumed the reader has both working knowledge of all products involved and the ability to perform the tasks outlined in this section. Administrators should have access to the product documentation for all products to install the required components. All RSA ID Plus and Microsoft Sentinel components must be installed and working prior to the integration.
Prerequisites and References
- The consumption-based plan of Microsoft Sentinel is used for this guide. Refer to Microsoft documentation for any changes to the configuration based on your plan selection.
- We have used a single region across the configurations. Refer to Microsoft documentation for any changes to the configuration in case of multi-region deployments.
- Make sure that the necessary permissions to create or set up the resources/components are available. Refer to Microsoft's official documentation for more information.
Configure RSA Cloud Access Service
Perform these steps to configure RSA Cloud Access Service (CAS).
Procedure
- Sign in to RSA Cloud Administration Console.
- Click Platform > API Access Management.
- Expand the Administration API Keys section and add an Access Key.
- In the Administrator Role list, choose Super Administrator.
- Use the API key generated to create an access token. Sample code is provided by RSA to generate an access token from the API key. For details, refer to Authentication for the Cloud Administration APIs.
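RSA's sample code and the Authentication for the Cloud Administration APIs documentation are the authoritative reference for token generation. As a rough illustration only, the sketch below builds an HS256-signed JWT from the key material using the Python standard library; the claim names (sub, iat, exp) and the signing algorithm are assumptions here, so confirm them against RSA's documentation before use.

```python
import base64
import hashlib
import hmac
import json
import time

def make_access_token(access_id: str, access_key: str, ttl: int = 3600) -> str:
    """Build an HS256-signed JWT access token from an API key.

    The claim set (sub/iat/exp) is illustrative -- verify the required
    claims in RSA's Cloud Administration API documentation.
    """
    def b64url(data: bytes) -> str:
        # JWT segments are base64url-encoded without padding.
        return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

    now = int(time.time())
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    claims = b64url(json.dumps({"sub": access_id, "iat": now, "exp": now + ttl}).encode())
    signature = b64url(
        hmac.new(access_key.encode(), f"{header}.{claims}".encode(), hashlib.sha256).digest()
    )
    return f"{header}.{claims}.{signature}"
```

The resulting token is what later appears as Authorization: Bearer &lt;Generated token&gt; in the Logic App's HTTP actions.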
Configure Microsoft
Perform the steps in the following subsections to configure Microsoft.
Log Analytics Workspace
- Log in to Azure and select Log Analytics workspace.
- Click Create.
- Select the Subscription, Resource group, and Region.
- Provide a Name. We have used "East US" as the Region across our deployment.
Custom Tables
PowerShell is used to create the custom table. Ensure that your PowerShell version supports the Az module cmdlets used below.
- Open the PowerShell console as an administrator and run the following commands.
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser   # Set the execution policy
Install-Module Az -Repository PSGallery -Force -AllowClobber   # Install the Azure (Az) module
Import-Module Az   # Verify the installation
Connect-AzAccount -DeviceCode   # Connect to the Azure account in PowerShell
If the "Unable to acquire token for tenant…" error occurs, try reconnecting with the tenant specified explicitly (for example, Connect-AzAccount -DeviceCode -Tenant &lt;tenantID&gt;) and check that the token is granted properly.
- Define the schema of the custom table and provide a name. Store the schema in a variable named tableParams.
- There should always be a column named TimeGenerated of type datetime. The rest of the columns have values coming from RSA.
- The name of the custom table must end with _CL.
- Add only the necessary columns as per the requirement to reduce the ingestion costs.
$tableParams = @'
{
    "properties": {
        "schema": {
            "name": "MyTable_CL",
            "columns": [
                { "name": "TimeGenerated", "type": "datetime", "description": "The time at which the data was generated" },
                { "name": "eventId", "type": "long", "description": "Event ID" },
                { "name": "eventLogDate", "type": "datetime", "description": "Time and date of the logged event" },
                { "name": "eventType", "type": "string", "description": "Type of event" },
                { "name": "serverURL", "type": "string", "description": "Server URL" },
                { "name": "serverIPAddress", "type": "string", "description": "Server IP address" },
                { "name": "application", "type": "string", "description": "Application" },
                { "name": "customerId", "type": "int", "description": "Customer ID" },
                { "name": "customerName", "type": "string", "description": "Customer name" },
                { "name": "sourceIPAddress", "type": "string", "description": "Source IP address" },
                { "name": "adminUserName", "type": "string", "description": "Admin user name" },
                { "name": "adminUserRole", "type": "string", "description": "Admin user role" },
                { "name": "activityKey", "type": "string", "description": "Activity key" },
                { "name": "activityCode", "type": "int", "description": "Activity code" },
                { "name": "result", "type": "string", "description": "Result" },
                { "name": "reasonKey", "type": "string", "description": "Reason key" },
                { "name": "message", "type": "string", "description": "Message" },
                { "name": "requiresPublish", "type": "boolean", "description": "Requires publish" }
            ]
        }
    }
}
'@
- Run the following commands to create the custom table.
$subscriptionID = "<subscriptionID>"
$resourcegroupname = "<Name of the resource group>"
$LAWname = "<Name of the Log Analytics workspace>"
Invoke-AzRestMethod -Path "/subscriptions/$subscriptionID/resourcegroups/$resourcegroupname/providers/microsoft.operationalinsights/workspaces/$LAWname/tables/MyTable_CL?api-version=2022-10-01" -Method PUT -Payload $tableParams
Data Collection Rule
- After the custom table is created, design a DCR schema as follows.
- Make sure the schema of the table defined here under streamDeclarations matches the schema of the custom table defined earlier.
- Stream names should start with 'custom-'. We have used 'Custom-MyTable'.
- Destination is the name of the Log Analytics workspace.
- Use transformKql under the dataFlows section to set the value of the TimeGenerated column. This can be the value of eventLogDate coming from RSA; we have used the current date and time (now()).
{
    "$schema": "https://schema.management.azure.com/schemas/2019-08-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "dataCollectionRuleName": {
            "type": "string",
            "metadata": { "description": "Specifies the name of the Data Collection Rule to create." }
        },
        "location": {
            "type": "string",
            "metadata": { "description": "Specifies the location in which to create the Data Collection Rule." }
        },
        "workspaceResourceId": {
            "type": "string",
            "metadata": { "description": "Specifies the Azure resource ID of the Log Analytics workspace to use." }
        }
    },
    "resources": [
        {
            "type": "Microsoft.Insights/dataCollectionRules",
            "name": "[parameters('dataCollectionRuleName')]",
            "location": "[parameters('location')]",
            "apiVersion": "2023-03-11",
            "kind": "Direct",
            "properties": {
                "streamDeclarations": {
                    "Custom-MyTable": {
                        "columns": [
                            { "name": "TimeGenerated", "type": "datetime" },
                            { "name": "eventId", "type": "long" },
                            { "name": "eventLogDate", "type": "datetime" },
                            { "name": "eventType", "type": "string" },
                            { "name": "serverURL", "type": "string" },
                            { "name": "serverIPAddress", "type": "string" },
                            { "name": "application", "type": "string" },
                            { "name": "customerId", "type": "int" },
                            { "name": "customerName", "type": "string" },
                            { "name": "sourceIPAddress", "type": "string" },
                            { "name": "adminUserName", "type": "string" },
                            { "name": "adminUserRole", "type": "string" },
                            { "name": "activityKey", "type": "string" },
                            { "name": "activityCode", "type": "int" },
                            { "name": "result", "type": "string" },
                            { "name": "reasonKey", "type": "string" },
                            { "name": "message", "type": "string" },
                            { "name": "requiresPublish", "type": "boolean" }
                        ]
                    }
                },
                "destinations": {
                    "logAnalytics": [
                        {
                            "workspaceResourceId": "[parameters('workspaceResourceId')]",
                            "name": "RSAAdminLogsLAWorkspace"
                        }
                    ]
                },
                "dataFlows": [
                    {
                        "streams": [ "Custom-MyTable" ],
                        "destinations": [ "RSAAdminLogsLAWorkspace" ],
                        "transformKql": "source | extend TimeGenerated = now()",
                        "outputStream": "Custom-MyTable_CL"
                    }
                ]
            }
        }
    ],
    "outputs": {
        "dataCollectionRuleId": {
            "type": "string",
            "value": "[resourceId('Microsoft.Insights/dataCollectionRules', parameters('dataCollectionRuleName'))]"
        }
    }
}
- Log in to Azure, type Deploy a custom template in the search bar, and select the service.
- Click Build your own template in the editor.
- Paste the DCR schema created earlier, and then click Save.
- Provide the details on the resulting form (ensure the Location matches the location of the Log Analytics workspace), and then click Review + create. The LAW ID is the Log Analytics Workspace ID.
- After the DCR is created, click Overview of the DCR, and then click JSON view.
- Copy the Immutable ID and Logs ingestion URI for the DCR.
Logic App
- Log in to the Microsoft Azure tenant and select Logic Apps.
- Click Create and add the following details (We have selected the Consumption Multi-tenant plan).
- Click Review + create > Create.
- Open the created Logic App and go to Settings > Identity.
- Turn the status toggle button under System assigned to On and click Save.
- Go to Monitor > Data Collection Rule and select the DCR created.
- In the left pane, select Access control (IAM) and click Add > Add role assignment.
- Search for 'Monitoring Metrics Publisher' and assign the role to the logic app's managed identity.
- Go to Log Analytics and select your workspace.
- Click Access control (IAM) > Add > Add role assignment.
- Assign the role Contributor to the Managed Identity (logic app) and click Next.
- Go to Monitoring > Diagnostic settings and make sure the allLogs checkbox is selected.
- Select the Logic app and then go to Development Tools > Logic app designer in the left pane.
- Design a workflow based on your requirements. Our workflow consists of an HTTP action that retrieves the logs from RSA, a Parse JSON action, and a second HTTP action that sends the parsed data to the DCR logs ingestion endpoint.
- In the first HTTP Action, provide the following details:
- URL: https://<Tenant>.auth.securid.com/AdminInterface/restapi/v1/adminlog/exportlogs
- Method: GET
- Headers:
- Authorization: Bearer <Generated token>
- Host: <Tenant>.auth.securid.com
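The first HTTP action can also be exercised outside the Logic App, for example when validating a freshly generated token. Below is a minimal Python sketch of the equivalent request; the tenant name and token are placeholders, and the URL and headers are exactly those listed above.

```python
import json
import urllib.request

def build_exportlogs_request(tenant: str, token: str) -> urllib.request.Request:
    """Build the GET request issued by the first HTTP action."""
    url = (f"https://{tenant}.auth.securid.com"
           "/AdminInterface/restapi/v1/adminlog/exportlogs")
    return urllib.request.Request(
        url,
        headers={
            "Authorization": f"Bearer {token}",
            "Host": f"{tenant}.auth.securid.com",
        },
    )

# Sending the request (requires a live tenant and a valid token):
# with urllib.request.urlopen(build_exportlogs_request("mytenant", token)) as resp:
#     page = json.load(resp)  # totalPages, pageSize, currentPage, elements[], ...
```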
- In the Parse JSON action, provide the following details:
- Content: Body from the output of the previous HTTP action in the Dynamic Content List.
- Schema: Use the Use sample payload to generate schema option to generate a schema. You can use Postman to obtain a sample payload.
The schema generated for our purpose:
{
    "type": "object",
    "properties": {
        "totalPages": { "type": "integer" },
        "totalElements": { "type": "integer" },
        "pageSize": { "type": "integer" },
        "currentPage": { "type": "integer" },
        "elements": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "eventId": { "type": "integer" },
                    "eventLogDate": { "type": "string" },
                    "eventType": { "type": "string" },
                    "serverURL": { "type": "string" },
                    "serverIPAddress": { "type": "string" },
                    "application": { "type": "string" },
                    "customerId": { "type": "integer" },
                    "customerName": { "type": "string" },
                    "sourceIPAddress": { "type": "string" },
                    "adminUserName": { "type": "string" },
                    "adminUserRole": { "type": "string" },
                    "activityKey": { "type": "string" },
                    "activityCode": { "type": "integer" },
                    "result": { "type": "string" },
                    "reasonKey": { "type": "string" },
                    "message": { "type": "string" },
                    "requiresPublish": { "type": "boolean" },
                    "targetObject1Id": {},
                    "targetObject1Name": {},
                    "targetObject1Type": {},
                    "targetObject2Id": {},
                    "targetObject2Name": {},
                    "targetObject2Type": {}
                },
                "required": [
                    "eventId", "eventLogDate", "eventType", "serverURL",
                    "serverIPAddress", "application", "customerId", "customerName",
                    "sourceIPAddress", "adminUserName", "adminUserRole", "activityKey",
                    "activityCode", "result", "reasonKey", "message", "requiresPublish",
                    "targetObject1Id", "targetObject1Name", "targetObject1Type",
                    "targetObject2Id", "targetObject2Name", "targetObject2Type"
                ]
            }
        }
    }
}
- Provide the following details in the second HTTP action.
- URL: {DCR Log Ingestion Endpoint}/dataCollectionRules/{DCR Immutable ID}/streams/{Stream Name}?api-version=2023-01-01
- Method: POST
- Headers
- Content-Type=application/json
- Body
- Body elements from the response of the previous action.
- Authentication
- Authentication Type=Managed Identity
- Managed Identity=System-assigned managed identity
- Audience=https://monitor.azure.com
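The second HTTP action can be sketched the same way. The function below only assembles the request; in the Logic App, the bearer token for the https://monitor.azure.com audience is obtained automatically via the managed identity, while the endpoint, immutable ID, and stream name come from the DCR's JSON view as copied earlier.

```python
import json
import urllib.request

def build_ingestion_request(endpoint: str, immutable_id: str, stream: str,
                            token: str, events: list) -> urllib.request.Request:
    """Build the POST that ships parsed log events to the DCR stream."""
    url = (f"{endpoint}/dataCollectionRules/{immutable_id}"
           f"/streams/{stream}?api-version=2023-01-01")
    return urllib.request.Request(
        url,
        data=json.dumps(events).encode(),  # array of event objects
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )
```

Here events is the elements array parsed from the RSA response; each object must match the Custom-MyTable stream declaration defined in the DCR.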
- Save the workflow and run it to test. After a successful run, the data returned from RSA should be available in the created custom table.
Microsoft Sentinel
- Select the Microsoft Sentinel service from Azure and click Create.
- Select the created Log Analytics workspace and click Add.
- In the left pane, click Analytics.
- Click the redirection link to the Defender portal.
- In the left pane, click Settings and select Microsoft Sentinel.
- Set the Log Analytics workspace created earlier as the primary workspace.
- Click Configuration > Analytics > Create > Scheduled query rule.
- Provide the details and click Next: Set rule logic.
- In the Rule Query text box, provide the detection query against the custom table, for example: MyTable_CL | where result == "FAILURE" (adjust the filter to your requirement).
- Make changes to query scheduling, Alert threshold, and Event grouping as per the business requirement, and then click Next.
- Make changes to the Incident settings and click Next: Automated Response.
- Add Automation rules, if any, and click Next.
- Validate and click Save.
The configuration is complete.