This document explains how to create and manage log sinks, which route log entries that originate in a Google Cloud project to supported destinations.
A sink performs a write action, so it must be authorized to write to the destination. When the destination is a log bucket in the same project as the sink, the sink is automatically authorized. For all other destinations, the sink must be attached to a service account that has been granted the permissions required to write data to the destination.
When a service account is required, Cloud Logging automatically creates and manages it. However, you might need to modify the permissions granted to the service account. You don't have to use the service account created by Logging. You can create and manage a service account that is used by sinks in multiple projects. For more information, see Configure log sinks with user-managed service accounts.
Overview
This page describes how to create a sink and how to configure the options you might see when using the Google Cloud console or the API.
Sinks belong to a given Google Cloud resource: a Google Cloud project, a billing account, a folder, or an organization. When the resource receives a log entry, every sink in the resource processes the log entry. When a log entry matches the filters of the sink, then the log entry is routed to the sink's destination.
Typically, sinks only route the log entries that originate in a resource. However, for folders and organizations you can create aggregated sinks, which route log entries from the folder or organization, and the resources it contains. This document doesn't discuss aggregated sinks. For more information, see Aggregated sinks overview.
To create and manage sinks, you can use the Google Cloud console, the Cloud Logging API, or the Google Cloud CLI. We recommend that you use the Google Cloud console for the following reasons:
- The Logs Router page lists all sinks and provides options to manage your sinks.
- When creating a sink, you can preview which log entries are matched by the sink's filters.
- You can configure sink destinations when creating a sink.
- Some authorization steps are completed for you.
Supported destinations
The destination of a sink can be in a different resource than the sink. For example, you can use a log sink to route log entries from one project to a log bucket stored in a different project.
The following destinations are supported:
- Google Cloud project
Select this destination when you want the log sinks in the destination project to reroute your log entries, or when you have created an intercepting aggregated sink. The log sinks in the project that is the sink destination can reroute the log entries to any supported destination except a project.
- Log bucket
Select this destination when you want to store your log data in resources managed by Cloud Logging. Log data stored in log buckets can be viewed and analyzed using services like the Logs Explorer and Log Analytics.
If you want to join your log data with other business data, then you can store your log data in a log bucket and create a linked BigQuery dataset. A linked dataset is a read-only dataset that can be queried like any other BigQuery dataset.
- BigQuery dataset
Select this destination when you want to join your log data with other business data. The dataset you specify must be write-enabled. Don't set the destination of a sink to be a linked BigQuery dataset; linked datasets are read-only.
- Cloud Storage bucket
Select this destination when you want long-term storage of your log data. The Cloud Storage bucket can be in the same project in which log entries originate, or in a different project. Log entries are stored as JSON files.
- Pub/Sub topic
Select this destination when you want to export your log data from Google Cloud and then use third-party integrations like Splunk or Datadog. Log entries are formatted into JSON and then routed to a Pub/Sub topic.
Destination limitations
This section describes destination-specific limitations:
- If you route log entries to a log bucket in a different Google Cloud project, then Error Reporting doesn't analyze those log entries. For more information, see Error Reporting overview.
- If you route log entries to a BigQuery dataset, the BigQuery dataset must be write-enabled. You can't route log entries to linked datasets, which are read-only.
- New sinks that route log data to Cloud Storage buckets might take several hours to start routing log entries. These sinks are processed hourly.
The following limitations apply when the destination of a log sink is a Google Cloud project:
- There is a one-hop limit.
- Log entries that match the filter of the _Required log sink are only routed to the _Required log bucket of the destination project when they originate in the destination project.
- Only aggregated sinks that are in the resource hierarchy of a log entry process the log entry.

For example, assume the destination of a log sink in project A is project B. Then the following are true:

- Due to the one-hop limit, the log sinks in project B can't reroute log entries to a Google Cloud project.
- The _Required log bucket of project B only stores log entries that originate in project B. This log bucket doesn't store any log entries that originate in any other resource, including those that originate in project A.
- If the resource hierarchies of project A and project B differ, then a log entry that a log sink in project A routes to project B isn't sent to the aggregated sinks in the resource hierarchy of project B.
- If project A and project B have the same resource hierarchy, then log entries are sent to the aggregated sinks in that hierarchy. If a log entry isn't intercepted by an aggregated sink, then the Log Router sends the log entry to the sinks in project A.
Before you begin
The instructions in this document describe creating and managing sinks at the Google Cloud project level. You can use the same procedure to create a sink that routes log entries that originate in an organization, folder, or billing account.
To get started, do the following:
- Enable the Cloud Logging API.
- Make sure that your Google Cloud project contains log entries that you can see in the Logs Explorer.
- To get the permissions that you need to create, modify, or delete a sink, ask your administrator to grant you the Logs Configuration Writer (roles/logging.configWriter) IAM role on your project. For more information about granting roles, see Manage access to projects, folders, and organizations. You might also be able to get the required permissions through custom roles or other predefined roles. For information about granting IAM roles, see the Logging Access control guide.
- Make sure that you have a resource in a supported destination, or the ability to create one. To route log entries to a destination, the destination must exist before you create the sink. You can create the destination in any Google Cloud project in any organization.
- Review the limitations that apply to the sink destination. For more information, see the Destination limitations section in this document.
Select the tab for how you plan to use the samples on this page:
Console
When you use the Google Cloud console to access Google Cloud services and APIs, you don't need to set up authentication.
gcloud
In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
REST
To use the REST API samples on this page in a local development environment, you use the credentials you provide to the gcloud CLI.
After installing the Google Cloud CLI, initialize it by running the following command:
gcloud init
If you're using an external identity provider (IdP), you must first sign in to the gcloud CLI with your federated identity.
For more information, see Authenticate for using REST in the Google Cloud authentication documentation.
Create a sink
This section describes how to create a sink in a Google Cloud project.
You can create up to 200 sinks per Google Cloud project.
To view the number and volume of log entries that are routed, view the logging.googleapis.com/exports/ metrics.
You use the Logging query language to create a filter expression that matches the log entries you want to include. Don't put sensitive information in sink filters. Sink filters are treated as service data.
When a query contains multiple statements, you can either specify how those statements are joined or rely on Cloud Logging implicitly adding the conjunctive restriction, AND, between the statements. For example, suppose a query or filter dialog contains two statements, resource.type = "gce_instance" and severity >= "ERROR". The actual query is resource.type = "gce_instance" AND severity >= "ERROR".

Cloud Logging supports both disjunctive restrictions, OR, and conjunctive restrictions, AND. When you use OR statements, we recommend that you group the clauses with parentheses.
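For example, a filter that combines OR clauses with an explicit AND might be written as follows; the resource types and the severity threshold are illustrative:

```
(resource.type = "gce_instance" OR resource.type = "cloud_run_revision") AND severity >= ERROR
```

Without the parentheses, the binding of the OR clause to the severity restriction is ambiguous to readers, even though Cloud Logging applies its own precedence rules.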
To create a sink, do the following:
Console
- In the Google Cloud console, go to the Log Router page:
  If you use the search bar to find this page, then select the result whose subheading is Logging.
- Select the Google Cloud project in which the log entries that you want to route originate.
  For example, if you want to route your Data Access log entries from the project named Project-A to a log bucket in the project named Project-B, then select Project-A.
- Select Create sink.
- In the Sink details panel, enter the following details:
  - Sink name: Provide an identifier for the sink. After you create the sink, you can't rename it, but you can delete it and create a new sink.
  - Sink description (optional): Describe the purpose or use case for the sink.
In the Sink destination panel, select the sink service and destination by using the Select sink service menu. Do one of the following:
To route log entries to a service that is in the same Google Cloud project, select one of the following options:
- Cloud Logging bucket: Select or create a Logging bucket.
- BigQuery dataset: Select or create the writeable dataset to receive the routed log entries. You also have the option to use partitioned tables.
- Cloud Storage bucket: Select or create the particular Cloud Storage bucket to receive the routed log entries.
- Pub/Sub topic: Select or create the particular topic to receive the routed log entries.
- Splunk: Select the Pub/Sub topic for your Splunk service.
To route log entries to a different Google Cloud project, select Google Cloud project, and then enter the fully-qualified name for the destination:
logging.googleapis.com/projects/DESTINATION_PROJECT_ID
To route log entries to a service that is in a different Google Cloud project, do the following:
- Select Other resource.
- Enter the fully-qualified name for the destination. For information about the syntax, see the Destination path formats.
Specify the log entries to include:
Go to the Choose logs to include in sink panel.
In the Build inclusion filter field, enter a filter expression that matches the log entries you want to include. To learn more about the syntax for writing filters, see Logging query language.
If you don't set a filter, all log entries from your selected resource are routed to the destination.
For example, to route all Data Access log entries to a Logging bucket, you can use the following filter:
log_id("cloudaudit.googleapis.com/data_access") OR log_id("externalaudit.googleapis.com/data_access")
The length of a filter can't exceed 20,000 characters.
To verify you entered the correct filter, select Preview logs. The Logs Explorer opens in a new tab with the filter pre-populated.
(Optional) Configure an exclusion filter to eliminate some of the included log entries:
Go to the Choose logs to filter out of sink panel.
In the Exclusion filter name field, enter a name.
In the Build an exclusion filter field, enter a filter expression that matches the log entries you want to exclude. You can also use the sample function to select a portion of the log entries to exclude.
You can create up to 50 exclusion filters per sink. The length of a filter can't exceed 20,000 characters.
Select Create sink.
Grant the service account for the sink the permission to write log entries to your sink's destination. For more information, see Set destination permissions.
gcloud
To create a sink, do the following:
Run the following gcloud logging sinks create command:

gcloud logging sinks create SINK_NAME SINK_DESTINATION

Before running the command, make the following replacements:

- SINK_NAME: The name of the log sink. You can't change the name of a sink after you create it.
- SINK_DESTINATION: The service or project to which you want your log entries routed. Set SINK_DESTINATION to the appropriate path, as described in Destination path formats.
  For example, if your sink destination is a Pub/Sub topic, then SINK_DESTINATION looks like the following:
  pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID

You can also provide the following options:

- --log-filter: Use this option to set a filter that matches the log entries you want to include in your sink. If you don't provide a value for the inclusion filter, then the filter matches all log entries.
- --exclusion: Use this option to set an exclusion filter for log entries that you want to exclude from routing. You can also use the sample function to select a portion of the log entries to exclude. This option can be repeated; you can create up to 50 exclusion filters per sink.
- --description: Use this option to describe the purpose or use case for the sink.

For example, to create a sink to a Logging bucket, your command might look like this:

gcloud logging sinks create my-sink logging.googleapis.com/projects/myproject123/locations/global/buckets/my-bucket \
  --log-filter='logName="projects/myproject123/logs/matched"' --description="My first sink"
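Similarly, here is a sketch of a command that routes only high-severity log entries to a Pub/Sub topic; the project ID, topic name, sink name, and filter are placeholder values:

```shell
# Route ERROR-and-above log entries to a Pub/Sub topic.
# my-project, my-topic, and my-pubsub-sink are placeholder names.
gcloud logging sinks create my-pubsub-sink \
  pubsub.googleapis.com/projects/my-project/topics/my-topic \
  --log-filter='severity >= ERROR' \
  --description="High-severity entries for third-party export"
```

The Pub/Sub topic must exist before you run the command, and the sink's writer identity must later be granted the Pub/Sub Publisher role on the topic.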
For more information on creating sinks using the Google Cloud CLI, see the gcloud logging sinks reference.

If the command response contains a JSON key labeled "writerIdentity", then grant the service account of the sink the permission to write to the sink destination. For more information, see Set destination permissions. You don't need to set destination permissions when the response doesn't contain a JSON key labeled "writerIdentity".
REST
To create a logging sink in your Google Cloud project, use projects.sinks.create in the Logging API. In the LogSink object, provide the appropriate required values in the method request body:

- name: An identifier for the sink. After you create the sink, you can't rename the sink, but you can delete it and create a new sink.
- destination: The service and destination to which you want your log entries routed. To route log entries to a different project, or to a destination that is in another project, set the destination field to the appropriate path, as described in Destination path formats.
  For example, if your sink destination is a Pub/Sub topic, then the destination looks like the following:
  pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID

In the LogSink object, provide the appropriate optional information:

- filter: Set the filter field to match the log entries you want to include in your sink. If you don't set a filter, all log entries from your Google Cloud project are routed to the destination. The length of a filter can't exceed 20,000 characters.
- exclusions: Set this field to match the log entries that you want to exclude from your sink. You can also use the sample function to select a portion of the log entries to exclude. You can create up to 50 exclusion filters per sink.
- description: Set this field to describe the purpose or use case for the sink.

Call projects.sinks.create to create the sink.

If the API response contains a JSON key labeled "writerIdentity", then grant the service account of the sink the permission to write to the sink destination. For more information, see Set destination permissions. You don't need to set destination permissions when the API response doesn't contain a JSON key labeled "writerIdentity".

For more information on creating sinks using the Logging API, see the LogSink reference.
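As a sketch, calling projects.sinks.create with curl might look like the following; the project ID, sink name, topic path, and filter are placeholders, and the request assumes that you've authenticated with the gcloud CLI:

```shell
# Create a sink by POSTing a LogSink object to the Logging API v2 endpoint.
# my-project, my-sink, and my-topic are placeholder values.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://logging.googleapis.com/v2/projects/my-project/sinks" \
  -d '{
    "name": "my-sink",
    "destination": "pubsub.googleapis.com/projects/my-project/topics/my-topic",
    "filter": "severity >= ERROR",
    "description": "High-severity entries for export"
  }'
```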
If you receive error notifications, then see Troubleshoot routing and sinks.
Destination path formats
If you route log entries to a service that is in another project, then you must provide the sink with the fully-qualified name for the service. Similarly, if you route log entries to a different Google Cloud project, then you must provide the sink with the fully-qualified name of the destination project:
Cloud Logging log bucket:
logging.googleapis.com/projects/DESTINATION_PROJECT_ID/locations/LOCATION/buckets/BUCKET_NAME
Another Google Cloud project:
logging.googleapis.com/projects/DESTINATION_PROJECT_ID
BigQuery dataset:
bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
Cloud Storage:
storage.googleapis.com/BUCKET_NAME
Pub/Sub topic:
pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID
Set destination permissions
This section describes how to grant Logging the Identity and Access Management permissions to write log entries to your sink's destination. For the full list of Logging roles and permissions, see Access control.
Cloud Logging creates a shared service account for a resource when a sink is created, unless the required service account already exists. The service account might exist because the same service account is used for all sinks in the underlying resource. Resources can be a Google Cloud project, an organization, a folder, or a billing account.
The writer identity of a sink is the identifier of the service account associated with that sink. All sinks have a writer identity except for sinks that write to a log bucket in the same Google Cloud project in which the log entry originates. For the latter configuration, a service account isn't required, and therefore the sink's writer identity field is listed as None in the console. The API and the Google Cloud CLI commands don't report a writer identity.
The following instructions apply to projects, folders, organizations, and billing accounts:
Console
Make sure that you have Owner access on the Google Cloud project that contains the destination. If you don't have Owner access to the destination of the sink, then ask a project owner to add the writer identity as a principal.
To get the sink's writer identity—an email address—from the new sink, do the following:
- In the Google Cloud console, go to the Log Router page:
  If you use the search bar to find this page, then select the result whose subheading is Logging.
- In the toolbar, select the project that contains the sink.
- Select more_vert Menu and then select View sink details. The writer identity appears in the Sink details panel.
- If the value of the writerIdentity field contains an email address, then proceed to the next step. When the value is None, you don't need to configure destination permissions for the sink.
- Copy the sink's writer identity into your clipboard. The email address identifies the principal. The prefix, serviceAccount:, specifies the account type.
- Grant the principal specified in the sink's writer identity the permission to write log data to the destination:
- In the Google Cloud console, go to the IAM page:
  If you use the search bar to find this page, then select the result whose subheading is IAM & Admin.
- In the toolbar, make sure that the selected project is either the project that stores the destination or is the sink destination. For example, if the destination is a log bucket, then make sure that the toolbar displays the project that stores the log bucket.
- Click Grant access.
- Grant the principal specified in the sink's writer identity an IAM role based on the destination of the log sink:
  - Google Cloud project: Grant the Logs Writer role (roles/logging.logWriter). Specifically, a principal needs the logging.logEntries.route permission.
  - Log bucket: Grant the Logs Bucket Writer role (roles/logging.bucketWriter).
  - Cloud Storage bucket: Grant the Storage Object Creator role (roles/storage.objectCreator).
  - BigQuery dataset: Grant the BigQuery Data Editor role (roles/bigquery.dataEditor).
  - Pub/Sub topic, including Splunk: Grant the Pub/Sub Publisher role (roles/pubsub.publisher).
gcloud
Make sure that you have Owner access on the Google Cloud project that contains the destination. If you don't have Owner access to the destination of the sink, then ask a project owner to add the writer identity as a principal.

- Get the service account from the writerIdentity field in your sink:

  gcloud logging sinks describe SINK_NAME

  Locate the sink whose permissions you want to modify. If the sink details contain a line with writerIdentity, then proceed to the next step. When the details don't include a writerIdentity field, you don't need to configure destination permissions for the sink.

  The writer identity for the service account looks similar to the following:

  serviceAccount:[email protected]

- Grant the sink's writer identity the permission to write log data to the destination by calling the gcloud projects add-iam-policy-binding command.

  Before using the following command, make the following replacements:

  - PROJECT_ID: The identifier of the project. Specify the project that stores the destination of the log sink. When the destination is a project, specify that project.
  - PRINCIPAL: An identifier for the principal that you want to grant the role to. Principal identifiers usually have the following form: PRINCIPAL-TYPE:ID. For example, user:[email protected]. For a full list of the formats that PRINCIPAL can have, see Principal identifiers.
  - ROLE: An IAM role. Grant the sink's writer identity an IAM role based on the destination of the log sink:
    - Google Cloud project: Grant the Logs Writer role (roles/logging.logWriter). Specifically, a principal needs the logging.logEntries.route permission.
    - Log bucket: Grant the Logs Bucket Writer role (roles/logging.bucketWriter).
    - Cloud Storage bucket: Grant the Storage Object Creator role (roles/storage.objectCreator).
    - BigQuery dataset: Grant the BigQuery Data Editor role (roles/bigquery.dataEditor).
    - Pub/Sub topic, including Splunk: Grant the Pub/Sub Publisher role (roles/pubsub.publisher).

- Execute the gcloud projects add-iam-policy-binding command:

  gcloud projects add-iam-policy-binding PROJECT_ID --member=PRINCIPAL --role=ROLE
REST
We recommend that you use the Google Cloud console or the Google Cloud CLI to grant a role to a service account.
Manage sinks
After your sinks are created, you can perform the following actions on them. Any changes made to a sink might take a few minutes to apply:

- View details
- Update
- Disable
  - You can't disable the _Required sink.
  - You can disable the _Default sink to stop it from routing log entries to the _Default Logging bucket.
  - If you want to disable the _Default sink for any new Google Cloud projects or folders created in your organization, then consider configuring default resource settings.
- Delete
  - You can't delete the _Default or _Required sinks.
  - When you delete a sink, it no longer routes log entries.
  - If the sink has a dedicated service account, then deleting that sink also deletes the service account. Sinks created before May 22, 2023 have dedicated service accounts. Sinks created on or after May 22, 2023 have a shared service account. Deleting the sink doesn't delete the shared service account.
- Troubleshoot failures
- View log volume and error rates
The following instructions describe how to manage a sink in a Google Cloud project. Instead of a Google Cloud project, you can specify a billing account, folder, or organization:
Console
-
In the Google Cloud console, go to the Log Router page:
If you use the search bar to find this page, then select the result whose subheading is Logging.
In the toolbar, select the resource that contains your sink. The resource can be a project, folder, organization, or billing account.
The Log Router page displays the sinks in the selected resource. Each table row contains information about a sink's properties:
- Enabled: Indicates if the sink's state is enabled or disabled.
- Type: The sink's destination service; for example, Cloud Logging bucket.
- Name: The sink's identifier, as provided when the sink was created; for example, _Default.
. - Description: The sink's description, as provided when the sink was created.
- Destination: Full name of the destination to which the routed log entries are sent.
- Created: The date and time that the sink was created.
- Last updated: The date and time that the sink was last edited.
- Volume: Reports the total volume of logs routed to the log sink. The value includes the volume routed to log buckets, projects, or to other destinations.
For each table row, the more_vert More actions menu provides the following options:
- View sink details: Displays the sink's name, description, destination service, destination, and inclusion and exclusion filters. Selecting Edit opens the Edit Sink panel.
- Edit sink: Opens the Edit Sink panel where you can update the sink's parameters.
- Disable sink: Lets you disable the sink and stop routing log entries to the sink's destination. For more information on disabling sinks, see Stop storing logs in log buckets.
- Enable sink: Lets you enable a disabled sink and restart routing log entries to the sink's destination.
- Delete sink: Lets you delete the sink and stop routing log entries to the sink's destination.
- Troubleshoot sink: Opens the Logs Explorer where you can troubleshoot errors with the sink.
- View sink log volume and error rates: Opens the Metrics Explorer where you can view and analyze data from the sink.
To sort the table by a column, select the column name.
gcloud
To view your list of sinks for your Google Cloud project, use the gcloud logging sinks list command, which corresponds to the Logging API method projects.sinks.list:

gcloud logging sinks list

To view your list of aggregated sinks, use the appropriate option to specify the resource that contains the sink. For example, if you created the sink at the organization level, use the --organization=ORGANIZATION_ID option to list the sinks for the organization.

To describe a sink, use the gcloud logging sinks describe command, which corresponds to the Logging API method projects.sinks.get:

gcloud logging sinks describe SINK_NAME

To update a sink, use the gcloud logging sinks update command, which corresponds to the API method projects.sinks.update. You can update a sink to change the destination, filters, and description, or to disable or re-enable the sink:

gcloud logging sinks update SINK_NAME NEW_DESTINATION --log-filter=NEW_FILTER

Omit NEW_DESTINATION or --log-filter if those parts don't change.

For example, to update the destination of your sink named my-project-sink to a new Cloud Storage bucket destination named my-second-gcs-bucket, your command looks like this:

gcloud logging sinks update my-project-sink storage.googleapis.com/my-second-gcs-bucket

To disable a sink, use the gcloud logging sinks update command, which corresponds to the API method projects.sinks.update, and include the --disabled option:

gcloud logging sinks update SINK_NAME --disabled

To re-enable the sink, use the gcloud logging sinks update command, remove the --disabled option, and include the --no-disabled option:

gcloud logging sinks update SINK_NAME --no-disabled

To delete a sink, use the gcloud logging sinks delete command, which corresponds to the API method projects.sinks.delete:

gcloud logging sinks delete SINK_NAME

For more information on managing sinks using the Google Cloud CLI, see the gcloud logging sinks reference.
REST
To view the sinks for your Google Cloud project, call projects.sinks.list.

To view a sink's details, call projects.sinks.get.

To update a sink, call projects.sinks.update. You can update a sink's destination, filters, and description. You can also disable or re-enable the sink.

To disable a sink, set the disabled field in the LogSink object to true, and then call projects.sinks.update.

To re-enable the sink, set the disabled field in the LogSink object to false, and then call projects.sinks.update.

To delete a sink, call projects.sinks.delete.

For more information about managing sinks by using the Logging API, see the LogSink reference.
Stop storing log entries in log buckets
You can disable the _Default sink and any user-defined sinks. When you disable a sink, the sink stops routing log entries to its destination. For example, if you disable the _Default sink, then no log entries are routed to the _Default bucket. The _Default bucket becomes empty when all of the previously stored log entries have fulfilled the bucket's retention period.

The following instructions illustrate how to disable your Google Cloud project sinks that route log entries to the _Default log bucket:
Console
-
In the Google Cloud console, go to the Log Router page:
If you use the search bar to find this page, then select the result whose subheading is Logging.
- To find all the sinks that route log entries to the _Default log bucket, filter the sinks by destination, and then enter _Default.
- For each sink, select more_vert Menu and then select Disable sink.

The sinks are now disabled and your Google Cloud project sinks no longer route log entries to the _Default bucket.
To reenable a disabled sink and restart routing log entries to the sink's destination, do the following:
-
In the Google Cloud console, go to the Log Router page:
If you use the search bar to find this page, then select the result whose subheading is Logging.
- To find all the sinks that route log entries to the _Default log bucket, filter the sinks by destination, and then enter _Default.
- For each sink, select more_vert Menu and then select Enable sink.
gcloud
To view your list of sinks for your Google Cloud project, use the gcloud logging sinks list command, which corresponds to the Logging API method projects.sinks.list:

gcloud logging sinks list

Identify any sinks that are routing to the _Default log bucket. To describe a sink, including seeing the destination name, use the gcloud logging sinks describe command, which corresponds to the Logging API method projects.sinks.get:

gcloud logging sinks describe SINK_NAME

Run the gcloud logging sinks update command and include the --disabled option. For example, to disable the _Default sink, use the following command:

gcloud logging sinks update _Default --disabled

The _Default sink is now disabled; it no longer routes log entries to the _Default log bucket.

To disable the other sinks in your Google Cloud project that are routing to the _Default bucket, repeat the previous steps.

To re-enable a sink, use the gcloud logging sinks update command, remove the --disabled option, and include the --no-disabled option:

gcloud logging sinks update _Default --no-disabled
REST
To view the sinks for your Google Cloud project, call the Logging API method projects.sinks.list.

Identify any sinks that are routing to the _Default bucket.

For example, to disable the _Default sink, set the disabled field in the LogSink object to true, and then call projects.sinks.update.

The _Default sink is now disabled; it no longer routes log entries to the _Default bucket.

To disable the other sinks in your Google Cloud project that are routing to the _Default bucket, repeat the previous steps.

To re-enable a sink, set the disabled field in the LogSink object to false, and then call projects.sinks.update.
Code samples
To use client library code to configure sinks in your chosen languages, see Logging client libraries: Log sinks.
Filter examples
Following are some filter examples that are particularly useful when creating sinks. For additional examples that might be useful as you build your inclusion filters and exclusion filters, see Sample queries.
Restore the _Default sink filter

If you edited the filter for the _Default sink, then you might want to restore this sink to its original configuration. When created, the _Default sink is configured with the following inclusion filter and an empty exclusion filter:
NOT log_id("cloudaudit.googleapis.com/activity") AND NOT \
log_id("externalaudit.googleapis.com/activity") AND NOT \
log_id("cloudaudit.googleapis.com/system_event") AND NOT \
log_id("externalaudit.googleapis.com/system_event") AND NOT \
log_id("cloudaudit.googleapis.com/access_transparency") AND NOT \
log_id("externalaudit.googleapis.com/access_transparency")
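One way to apply this filter is with the gcloud CLI, passing the whole expression as a single quoted argument; this is a sketch, and the filter text is the original filter shown above:

```shell
# Restore the _Default sink's original inclusion filter.
gcloud logging sinks update _Default \
  --log-filter='NOT log_id("cloudaudit.googleapis.com/activity") AND NOT log_id("externalaudit.googleapis.com/activity") AND NOT log_id("cloudaudit.googleapis.com/system_event") AND NOT log_id("externalaudit.googleapis.com/system_event") AND NOT log_id("cloudaudit.googleapis.com/access_transparency") AND NOT log_id("externalaudit.googleapis.com/access_transparency")'
```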
Exclude Google Kubernetes Engine container and pod logs
To exclude Google Kubernetes Engine container and pod log entries for GKE system namespaces, use the following filter:
resource.type = ("k8s_container" OR "k8s_pod")
resource.labels.namespace_name = (
"cnrm-system" OR
"config-management-system" OR
"gatekeeper-system" OR
"gke-connect" OR
"gke-system" OR
"istio-system" OR
"knative-serving" OR
"monitoring-system" OR
"kube-system")
To exclude Google Kubernetes Engine node log entries for GKE system logNames, use the following filter:
resource.type = "k8s_node"
logName:( "logs/container-runtime" OR
"logs/docker" OR
"logs/kube-container-runtime-monitor" OR
"logs/kube-logrotate" OR
"logs/kube-node-configuration" OR
"logs/kube-node-installation" OR
"logs/kubelet" OR
"logs/kubelet-monitor" OR
"logs/node-journal" OR
"logs/node-problem-detector")
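As a sketch, you could attach a filter like the namespace filter above as an exclusion when creating a sink; the sink name, bucket path, and the shortened namespace list are placeholder values:

```shell
# Create a sink with an exclusion for GKE system-namespace container and pod logs.
# my-sink, my-project, and my-bucket are placeholder values; the namespace list
# is abbreviated for readability.
gcloud logging sinks create my-sink \
  logging.googleapis.com/projects/my-project/locations/global/buckets/my-bucket \
  --exclusion=name=gke-system-namespaces,filter='resource.type = ("k8s_container" OR "k8s_pod") AND resource.labels.namespace_name = ("kube-system" OR "gke-system" OR "istio-system")'
```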
To view the volume of Google Kubernetes Engine node, pod, and container log entries stored in log buckets, use Metrics Explorer.
Exclude Dataflow logs not required for supportability
To exclude Dataflow log entries that aren't required for supportability, use the following filter:
resource.type="dataflow_step"
labels."dataflow.googleapis.com/log_type"!="system" AND labels."dataflow.googleapis.com/log_type"!="supportability"
To view the volume of Dataflow logs stored in log buckets, use Metrics Explorer.
Supportability
Although Cloud Logging lets you exclude log entries and prevent them from being stored in a log bucket, you might want to consider keeping log entries that help with supportability. Using these log entries can help you troubleshoot and identify issues with your applications.
For example, GKE system log entries are useful to troubleshoot your GKE applications and clusters because they are generated for events that happen in your cluster. These log entries can help you determine if your application code or the underlying GKE cluster is causing your application error. GKE system logs also include Kubernetes Audit Logging generated by the Kubernetes API Server component, which includes changes made using the kubectl command and Kubernetes events.
For Dataflow, we recommend that you, at a minimum, write your system logs (labels."dataflow.googleapis.com/log_type"="system") and supportability logs (labels."dataflow.googleapis.com/log_type"="supportability") to log buckets. These logs are essential for developers to observe and troubleshoot their Dataflow pipelines; without them, users might not be able to use the Dataflow Job details page to view job logs.
What's next
If you encounter issues as you use sinks to route log entries, see Troubleshoot routing logs.
To learn how to view log entries in their destinations, as well as how the logs are formatted and organized, see View logs in sink destinations.
To learn more about querying and filtering with the Logging query language, see Logging query language.