+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Ingest logs to OCI Logging Analytics using Fluentd
https://docs.oracle.com/en/learn/oci_logging_analytics_fluentd/#introduction
Fluentd Frequently Asked Questions:
https://www.fluentd.org/faqs
Installing Fluentd
https://docs.fluentd.org/v/0.12/articles/install-by-rpm
Fluentd : Quick Overview on Fluentd - YOUTUBE Videos
https://www.youtube.com/watch?v=Gp0-7oVOtPw
Refer to the quick start tutorial if you want a quick experience of setting up and using Logging Analytics. See Oracle Cloud Infrastructure Logging Analytics Quick Start Guide
https://www.youtube.com/playlist?list=PLiuPvpy8QsiV_QT9A-pECFkK30yMJEXOu
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Ingest Logs :
Ingest Logs from Other OCI Services Using Service Connector
https://docs.oracle.com/en-us/iaas/logging-analytics/doc/ingest-logs.html
how the Service Connector Hub works
Overview of Service Connector
Ingest Logs to OCI Logging Analytics Using Fluentd
---------------------------------------------------------------------------------------------------------------------------
PLAN BELOW
It is recommended that you follow the streamlined onboarding process presented in the OCI console to complete the minimal prerequisites to start using Logging Analytics.
Ingest Logs to OCI Logging Analytics Using Fluentd
Step 1 :
Fluentd has components which work together to collect the log data from the input sources, transform the logs, and route the log data to the desired output. You can install and configure the output plugin for Fluentd to ingest logs from various sources into Oracle Cloud Logging Analytics.
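The output side of this pipeline is Oracle's Fluentd output plugin for Logging Analytics. As a hedged sketch only — the plugin type `oci-la` and the parameter names below are recalled from the tutorial linked above and should be verified against that page before use; all values in angle brackets are placeholders — a minimal match block could look like:

```
# Route all events tagged oci.* to OCI Logging Analytics
# (plugin type and parameter names assumed from the linked tutorial)
<match oci.**>
  @type oci-la
  # OCI CLI/SDK config file and profile used for authentication
  config_file_location /etc/td-agent/.oci/config
  profile_name DEFAULT
  # Target Logging Analytics log group and log source (placeholders)
  oci_la_log_group_id <log_group_OCID>
  oci_la_log_source_name <log_source_name>
  # File buffer so events survive an agent restart
  <buffer>
    @type file
    path /var/log/td-agent/oci-la-buffer
  </buffer>
</match>
```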
Prerequisites
Install Fluentd and Input Plugins:
We recommend using td-agent 2.5.
Before Installation - Pre-requisites :
Step 1: Install from rpm Repository
CentOS and RHEL 5, 6, 7 and Amazon Linux are currently supported.
Executing install-redhat-td-agent2.sh will automatically install td-agent on your machine. This shell script registers a new rpm repository at /etc/yum.repos.d/td.repo and installs the td-agent rpm package.
# td-agent 2.5 or later. Only CentOS/RHEL 6 and 7 for now.
$ curl -L https://toolbelt.treasuredata.com/sh/install-redhat-td-agent2.5.sh | sh
# td-agent 2.3 or earlier
$ curl -L https://toolbelt.treasuredata.com/sh/install-redhat-td-agent2.sh | sh
Step 2: Launch Daemon
The /etc/init.d/td-agent script is provided to start, stop, or restart the agent.
$ sudo /etc/init.d/td-agent start
Starting td-agent: [ OK ]
$ sudo /etc/init.d/td-agent status
td-agent (pid 21678) is running...
The following commands are supported:
$ sudo /etc/init.d/td-agent start
$ sudo /etc/init.d/td-agent stop
$ sudo /etc/init.d/td-agent restart
$ sudo /etc/init.d/td-agent status
Please make sure your configuration file is located at /etc/td-agent/td-agent.conf.
Configuration File
The configuration file allows the user to control the input and output behavior of Fluentd by (1) selecting input and output plugins and (2) specifying the plugin parameters. The file is required for Fluentd to operate properly.
Config File Location
List of Directives
The configuration file consists of the following directives:
source : directives determine the input sources.
match : directives determine the output destinations.
filter : directives determine the event processing pipelines.
system : directives set system wide configuration.
label : directives group the output and filter for internal routing
include : directives include other files.
(1) "source": where all the data come from
Fluentd's input sources are enabled by selecting and configuring the desired input plugins using source directives. Fluentd's standard input plugins include http and forward. http turns Fluentd into an HTTP endpoint that accepts incoming HTTP messages, whereas forward turns Fluentd into a TCP endpoint that accepts TCP packets. It can, of course, be both at the same time (you can add as many sources as you wish).
# Receive events from 24224/tcp
# This is used by log forwarding and the fluent-cat command
<source>
@type forward
port 24224
</source>
# http://this.host:9880/myapp.access?json={"event":"data"}
<source>
@type http
port 9880
</source>
Each source directive must include a @type parameter. The @type parameter specifies which input plugin to use.
Interlude: Routing
The source submits events into Fluentd's routing engine. An event consists of three entities: tag, time and record. The tag is a string separated by '.'s (e.g. myapp.access), and is used as the directions for Fluentd's internal routing engine. The time field is specified by input plugins, and it must be in the Unix time format. The record is a JSON object. Fluentd accepts all non-period characters as part of a tag. However, since the tag is sometimes used in a different context by output destinations (e.g., table name, database name, key name, etc.), it is strongly recommended that you stick to lower-case alphabetic characters, digits, and underscores, e.g., ^[a-z0-9_]+$.
In the example above, the HTTP input plugin submits the following event:
# generated by http://this.host:9880/myapp.access?json={"event":"data"}
tag: myapp.access
time: (current time)
record: {"event":"data"}
Didn't find your input source? You can write your own plugin!
You can add new input sources by writing your own plugins. For further information regarding Fluentd's input sources, please refer to the Input Plugin Overview article.
(2) "match": Tell fluentd what to do!
The "match" directive looks for events with matching tags and processes them. The most common use of the match directive is to output events to other systems (for this reason, the plugins that correspond to the match directive are called "output plugins"). Fluentd's standard output plugins include file and forward. Let's add those to our configuration file.
# Receive events from 24224/tcp
# This is used by log forwarding and the fluent-cat command
<source>
@type forward
port 24224
</source>
# http://this.host:9880/myapp.access?json={"event":"data"}
<source>
@type http
port 9880
</source>
# Match events tagged with "myapp.access" and
# store them to /var/log/fluent/access.%Y-%m-%d
# Of course, you can control how you partition your data
# with the time_slice_format option.
<match myapp.access>
@type file
path /var/log/fluent/access
</match>
Each match directive must include a match pattern and a @type parameter. Only events with a tag matching the pattern will be sent to the output destination (in the above example, only the events with the tag "myapp.access" are matched. See the section below for more advanced usage). The @type parameter specifies the output plugin to use. Just like input sources, you can add new output destinations by writing your own plugins. For further information regarding Fluentd's output destinations, please refer to the Output Plugin Overview article.
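The match pattern above is exact, but Fluentd also supports wildcard patterns. This is standard Fluentd behavior; the tag names below are illustrative:

```
# "*" matches exactly one tag part: "myapp.*" matches "myapp.access"
# but not "myapp.access.error".
# "**" matches zero or more tag parts: "myapp.**" matches "myapp",
# "myapp.access" and "myapp.access.error".
<match myapp.**>
  @type file
  path /var/log/fluent/myapp
</match>
```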
===Check the Page for more
2. Understand the Hierarchy of Key Resources: Entities, sources, and parsers are some of the key resources in Oracle Cloud Logging Analytics which are used for setting up the log collection. Understand their interdependencies to perform the prerequisite tasks before you can start viewing the logs in the Log Explorer.
To get started with Oracle Cloud Logging Analytics, follow these steps.
Topics:
Enable Access to Logging Analytics and Its Resources
- Enable Access from Logging Analytics to Its Features Family
- Identify OCI Compartments to Place the Logging Analytics Resources
- Create User Groups to Implement Access Control
- Grant Access to User Groups
- Enable Logging Analytics
- After completing the prerequisite tasks such as creating user groups, creating compartments, and defining access policies for the user groups, you can access Oracle Cloud Logging Analytics and enable it for use.
- Open the navigation menu, click Observability & Management, and then click Logging Analytics.
- If this is the first time that you are using the service in this region, you will land on an on-boarding page that will give you some high level details of the service and an option to start using Oracle Cloud Logging Analytics service. Click Start Using Logging Analytics.
- The Enable Logging Analytics dialog box is displayed. Here, the minimum required policies and log group are created if they don't exist already.
- Click Set Up Ingestion to initiate ingesting the logs.
Confirm whether you want to install the Management Agent to continuously collect logs from the host. Also, confirm whether you want to collect OCI Audit Logs for analysis. Based on your preferences, the policies are created and suitable actions are performed.
After the on-boarding is complete, you can explore Oracle Cloud Logging Analytics.
Create Logging Analytics Resources
You must perform the following tasks to start viewing your log data in Oracle Cloud Logging Analytics:
1. Create Log Groups to Store Your Logs
Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.
Click Log Groups from the list of available resources.
Ensure your compartment selector on the left indicates you are in the desired compartment for this new log group.
Click Create > Enter Name of the log group > Enter Description of the log group > Click Save.
2. Create an Entity to Represent Your Log-Emitting Resource
- Open the navigation menu and click Observability & Management. Under Logging Analytics, click Administration. The Administration Overview page opens.
- Click Entities from the list of available resources.
- Ensure your compartment selector on the left indicates you are in the desired compartment for this new entity.
- Click Create. Select an Entity Type, provide a Name and if you have any properties that can be used to indicate the file paths, add those properties to the entity. Some entity types already have parameters that need to be filled out for the entity to make use of an Oracle-Defined log source. Click Save.
3. Configure Management Dashboard
We are keeping this as optional for now.
4. Minimum Set of IAM Policies Required to Use Logging Analytics
After you complete the previous configuration steps, you must have the following policy statements to efficiently use the service:
allow service loganalytics to READ loganalytics-features-family in tenancy
allow group Logging-Analytics-SuperAdmins to READ compartments in tenancy
allow group Logging-Analytics-SuperAdmins to MANAGE loganalytics-features-family in tenancy
allow group Logging-Analytics-SuperAdmins to MANAGE loganalytics-resources-family in tenancy
allow group Logging-Analytics-SuperAdmins to READ metrics in tenancy
allow group Logging-Analytics-SuperAdmins to READ users in tenancy
allow group Logging-Analytics-SuperAdmins to MANAGE management-dashboard-family in tenancy
In the above set, the last statement corresponds to the permissions required to create and use Management Dashboard from Oracle Cloud Logging Analytics.
You can define the access to the Individual Resource Types at compartment level instead of tenancy level for more fine-grained log access control.
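As a sketch of such compartment-scoped statements — the group and compartment names are hypothetical, and the individual resource-type names should be checked against the Logging Analytics policy reference before use:

# Read-only access to selected Logging Analytics resource types,
# scoped to one compartment instead of the whole tenancy
allow group Logging-Analytics-Users to READ loganalytics-log-group in compartment MyCompartment
allow group Logging-Analytics-Users to READ loganalytics-source in compartment MyCompartment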
The list of resources can be viewed at
SECONDLY - Since we are using Fluentd, use the link below, which helps us understand how to ingest logs from your host servers into OCI Logging.
Diagram on how Fluentd architecture works
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
OCI Logging
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Managing Logs and Log Groups
This topic describes how to manage logs and log groups.
Overview of Logs and Log Groups
Logs contain critical diagnostic information that tells you how your resources are performing and being accessed. You can enable logging on supported resources. To see a list of supported resources grouped by service, see Supported Services.
Log groups are logical containers for organizing logs. Logs must always be inside log groups. You must create a log group to enable a log.
Use log groups to limit access to sensitive logs with IAM policy. With log groups, you don't have to rely on complex compartment hierarchies to secure your logs. For example, say the default log group in a single compartment is where you store logs for the entire tenancy. You grant access to the compartment for log administrators with IAM policy as you normally would. However, let's say some projects contain personally identifiable information (PII) and those logs can only be viewed by a select group of log administrators. Log groups allow you to put logs that contain PII into a separate log group, and then use IAM policy to restrict access to all but a few log administrators.
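The PII scenario above could be expressed with policy statements along these lines. The group and compartment names are hypothetical, and the `where target.loggroup.id` condition syntax should be verified against the Logging policy reference:

# All log administrators can list log groups and read log content,
# except for the log group holding PII (hypothetical OCID placeholder)
allow group Log-Admins to read log-groups in compartment Ops
allow group Log-Admins to read log-content in compartment Ops where target.loggroup.id != '<pii_log_group_OCID>'

# Only the select group of administrators can read all log content,
# including the PII log group
allow group PII-Log-Admins to read log-content in compartment Ops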
Overview
How Logging Works
Use Logging to enable, manage, and search logs. The three kinds of logs are the following:
- Audit logs: Logs related to events emitted by the Oracle Cloud Infrastructure Audit service. These logs are available from the Logging Audit page, or are searchable on the Search page alongside the rest of your logs.
- Service logs: Emitted by OCI native services, such as API Gateway, Events, Functions, Load Balancing, Object Storage, and VCN Flow Logs. Each of these supported services has pre-defined logging categories that you can enable or disable on your respective resources.
- Custom logs: Logs that contain diagnostic information from custom applications, other cloud providers, or an on-premises environment. Custom logs can be ingested through the API, or by configuring the Unified Monitoring Agent. You can configure an OCI compute instance/resource to directly upload custom logs through the Unified Monitoring Agent. Custom logs are supported in both virtual machine and bare metal scenarios.
Audit Logs :
On the Audit page, you can explore audit logs. Audit logs are also searchable on the Search page, and you can view Audit logs in every compartment by selecting the /_Audit log group on the Search page. For an overview of Audit,
Details for the Audit Service
This topic covers details for writing policies to control access to the Audit service.
Required Permissions for Searching Logs
To search indexed logs, a user must have the read permission on the log content and read access to the log group.
allow group GroupA to read log-groups in tenancy
allow group GroupA to read log-content in tenancy
To search indexed logs, you must have access to the log group that contains the indexed logs. For more information
To view and search Audit Logs, you must also have the corresponding Audit-related permissions
- search "compartment" requires AUDIT_EVENT_READ, and if there are any log objects, it would also require LOG_CONTENT_READ
- search "compartment/_Audit" requires just AUDIT_EVENT_READ.
- search "compartmentOcid/logGroupNameOrOcid/logNameOrOcid" requires LOG_CONTENT_READ only.
- search "compartmentOcid1/_Audit" "compartmentOcid2/logGroupNameOrOcid/logNameOrOcid" requires LOG_CONTENT_READ on logGroupNameOrOcid and AUDIT_EVENT_READ on compartmentOcid1.
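The mixed-scope search case above implies policy statements along these lines. The group name is hypothetical; the resource types `audit-events`, `log-groups`, and `log-content` are as described in the Audit and Logging policy references:

# AUDIT_EVENT_READ on the first compartment
allow group Audit-Searchers to read audit-events in compartment CompartmentA

# LOG_CONTENT_READ (and log group visibility) on the compartment
# holding the indexed log group
allow group Audit-Searchers to read log-groups in compartment CompartmentB
allow group Audit-Searchers to read log-content in compartment CompartmentB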
Filtering Audit Logs
To filter Audit logs:
1. Open the navigation menu and click Observability & Management. Under Logging, click Audit. The list of audit logs in the current compartment is displayed.
2. Choose a compartment you have permission to work in.
3. In User, add user filters. Multiple users can be added.
4. In Resource, add resource filters. Multiple resources can be filtered on.
5. In Request action types, select an action operation. Multiple request action types can be filtered on.
6. In Event type, add event filters. Multiple event filters can be added.
7. In Custom filters, start typing to automatically display filter settings, along with operators. For example, entering d displays filters starting with that letter. Use the up or down arrow keys to select from the list, or continue typing to enter what you want to filter on. This functions the same as this field on the Logging Search page.
8. In Filter by time, select from one of the preset time periods:
- Past 5 Minutes (the default)
- Past 15 Minutes
- Past Hour
- Past 3 Hours
- Today
- Custom (choose your own using the Start Date and End Date fields)
9. After entering your search text or filters, click Apply.
Service Logs :
Service Logs
Oracle Cloud Infrastructure services emit service logs. Each of these supported services has a Logs resource that allows you to enable or disable logging for that service.
Supported Services
You can enable service logs for the following Oracle Cloud Infrastructure services:
- API Gateway
- DevOps
- Email Delivery
- Events
- Functions
- Integration
- Load Balancing
- Object Storage
- Site-to-Site VPN
Site-to-Site VPN logs are only supported with v2 IPSec connections. v1 connections are not supported.
- VCN Flow Logs
- Web Application Firewall
Enabling Logging for a Resource :
For a newly created resource, logging is automatically enabled. For a resource you want to enable logging on, under Resources click Logs, and then toggle Enable Logging. The Create Log panel is displayed, and the entry fields are pre-populated:
- Compartment (the same as your resource)
- Log Group: The first log group in your compartment. You can select another log group, or create a new group by clicking Create New Group.
- Log Name: Pre-populated as the name of your resource and the category, which are combined with an underscore (<resource>_<category>). For example, if the resource is named "resource" and the category is "ruleexecutionlog", the log name is "resource_ruleexecutionlog".
- Log Retention: The default retention period for the log. You can select a different retention period.
After logging is enabled, you can click the link under Log name or Log group to view the log details or log group details pages, respectively.
To disable logging, toggle the Enable log control, which displays a disable logging confirmation dialog. Click Disable Log to confirm. The Status field is set to INACTIVE to indicate the inactive status.
When creating a log, a log object is established. To delete the log, select Delete from the action menu. A confirmation dialog asks whether you want to delete the log. Clicking Delete removes the log object entirely, as opposed to disabling it (in which case the log object still exists but no longer records new data).
Enabling Logging on the Logs page
- Open the navigation menu and click Observability & Management. Under Logging, click Logs. The Logs page is displayed.
- Click Enable service log. The Enable Resource Log panel is displayed.
- Under Select Resource, Resource Compartment, choose a compartment you have permission to work in.
- Select the service for which you want to enable resource logging from the Service list (see Supported Services for the available services).
- In Resource select a resource.
- Configure the log:
In Log Category select a log category to specify the type of log to create. For example, Object Storage buckets have categories for read and write. Select read to enable a log with only read events. Select write for a log with only write events.
- You can only have one log for any combination of service, resource, and log category. For example, Object Storage buckets have two categories: read and write. Therefore:
- You can enable a single read log and a single write log for every bucket in your tenancy.
- You cannot enable more than two logs (one read and one write) for any one bucket.
In Log Name, type a name for the log. See Log and Log Group Names for more information. Avoid entering confidential information.
Select Enable Legacy Archival Logs to automatically create a bucket in your compartment and place a copy of your log there. See Legacy Archival for more information.
7. To specify the log location, click Show Advanced Options if necessary.
8. In Log Retention, select a value from the list:
- 1 month (the default)
- 2 months
- 3 months
- 4 months
- 5 months
- 6 months
9. Apply any tagging-related information in the Tag Namespace, Tag Key, and Value fields.
10. Click Enable Log.
The Log detail page is displayed, and the log is in the process of being created (a "Creating log" message is displayed). See Using the Console for more information on viewing and using this page.
Custom Logs
Custom logs are logs that contain diagnostic information from custom applications, other cloud providers, or an on-premises environment. Custom logs can be ingested in the following ways:
- By using PutLogs to ingest custom logs directly. See the Logging Ingestion API and REST APIs for more information. Also see Using the API for an example log entry payload that can be used with PutLogs.
- By configuring the Unified Monitoring Agent. See Installing the Agent for instructions.
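As an illustration of the direct-ingestion path, a PutLogs request body might look like the following. The field names follow the Logging Ingestion API reference, but the values are placeholders — check the linked "Using the API" example for the authoritative shape:

```json
{
  "specversion": "1.0",
  "logEntryBatches": [
    {
      "source": "my-app-host-01",
      "type": "com.example.myapp.applog",
      "defaultlogentrytime": "2024-01-01T00:00:00.000Z",
      "entries": [
        {
          "id": "unique-entry-id-0001",
          "time": "2024-01-01T00:00:00.000Z",
          "data": "Example custom log line emitted by the application"
        }
      ]
    }
  ]
}
```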
Custom logs can be viewed in the Oracle Cloud Infrastructure Compute instance page, and have an associated Logs resource. They can also be viewed on the Logging Search page, Logs page, or within an associated Log Groups detail page. Custom logs are also supported in bare metal instances.
Creating Custom Logs
To create custom logs:
- Open the navigation menu and click Observability & Management. Under Logging, click Logs.
- Under List Scope, Compartment, choose a compartment you have permission to work in.
- Click Create custom log. The Create custom log panel is displayed.
- In Custom log name, enter a name for the custom log. Avoid entering confidential information. From Compartment, choose a compartment you have permission to work in.
- From Log group, select a log group to place the custom log into.
- Optionally, select a log retention value from Log Retention, and add any applicable tags in Add Tags.
- Click Create custom log. The Create agent configuration panel is displayed. You can next create a new configuration, to define the parameters for the associated log data (the default), or add it later.
- Enter a Configuration name in the corresponding field, and select a Compartment you have permissions to work in.
- In Host Groups, which allows you to define which VMs this configuration applies to, select a Group type from the list: either Dynamic group or User group.
- For the Dynamic group case, Dynamic Group refers to a group of instances, which you can create in the IAM feature of the Console. See About Dynamic Groups for more information. These Dynamic Groups can be selected from the Group field when setting up Dynamic Group settings.
For the User group case, select the group from the Group field. User Groups also refer to the IAM Groups feature of the Console. See Managing Groups for more information.
Click + Another host group to add more groups. You can add a combination of Group Types for the agent configuration, that is, both Dynamic groups and User groups can be set up in the configuration.
NOTE
A maximum of five groups per configuration are allowed, and a host can be in a maximum of five different groups.
Next, in the configuration, you need to define the format of the logs (that is, what logs you want to watch for) in Configure log inputs. Select an Input type from the list: either Windows event log or Log path.
For Windows event log, enter an Input name and select an Event channels option from the list.
For Log path, enter an Input name and File paths in the corresponding fields. For example, /<log_path>/<log_name>. Multiple paths can be entered.
<source>
@type tail
tag 757261.oc_oslogs_linux
path /var/log/*.log,/var/log/*.out,/var/log/dmesg,/var/log/grubby,/var/log/messages*,/var/log/secure,/var/log/auth,/var/log/acpid,/root/.bash_history
pos_file /etc/unifiedmonitoringagent/pos/757261-oc_oslogs_linux.pos
path_key tailed_path
</source>
Example configuration:
path C:\Program Files (x86)\<application>\<directory>*, C:\Program Files (x86)\<application>\<application_logs_directory>\<directory>*
Click Advanced parser options, which opens the Advanced parser options panel. This allows you to specify how to parse the log, according to the following parsers. Some of the parsers require further input and have more options, depending on the type chosen.
- AUDITD
- JSON
- TSV
- CSV
- NONE (the default)
- SYSLOG
- APACHE2
- APACHE_ERROR
- MSGPACK
- REGEXP
- MULTILINE
For example, for JSON you must select a Time type value from the list, while optionally you can specify event time and null field settings. Meanwhile, for REGEXP you specify the regular expression for matching logs, along with the time format. See Log Inputs and Parsers for more information.
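To make the REGEXP case concrete, here is an illustrative Fluentd-style equivalent (td-agent 2.x `format` syntax) of what such a parser configuration expresses. The log line format, file path, and tag are hypothetical:

```
<source>
  @type tail
  path /var/log/myapp/app.log
  pos_file /var/run/td-agent/myapp.pos
  tag myapp.custom
  # REGEXP parser: named capture groups become record fields;
  # the "time" group is parsed using time_format
  format /^(?<time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) (?<level>[A-Z]+) (?<message>.*)$/
  time_format %Y-%m-%d %H:%M:%S
</source>
```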
12. After configuring the log inputs and the parser, you can optionally specify any tag settings. Click Create custom log to save your changes, and create the custom log and its associated agent configuration.
Agent Management -- I think this speaks about the Oracle Agent; we don't need this.
Viewing Custom Logs in a Compute Instance