Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package oci-cli for openSUSE:Factory checked 
in at 2024-03-05 18:49:39
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/oci-cli (Old)
 and      /work/SRC/openSUSE:Factory/.oci-cli.new.1770 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "oci-cli"

Tue Mar  5 18:49:39 2024 rev:56 rq:1147595 version:3.37.9

Changes:
--------
--- /work/SRC/openSUSE:Factory/oci-cli/oci-cli.changes  2024-02-09 23:55:07.853041430 +0100
+++ /work/SRC/openSUSE:Factory/.oci-cli.new.1770/oci-cli.changes        2024-03-05 18:49:48.875197008 +0100
@@ -1,0 +2,39 @@
+Fri Feb 16 13:30:57 UTC 2024 - John Paul Adrian Glaubitz <adrian.glaub...@suse.com>
+
+- Update to version 3.37.9
+  * Support for new optional parameter isReplicateAutomaticBackups
+    in the Database Service
+    * ``oci db autonomous-database change-disaster-recovery-configuration
+      --is-replicate-automatic-backups``
+    * ``oci db autonomous-database create-autonomous-database-create-cross-\
+      region-disaster-recovery-details --is-replicate-automatic-backups``
+  * Loganalytics service
+    * Support for additional attributes in entity and topology
+      * ``oci log-analytics entity create --metadata, --time-last-discovered``
+      * ``oci log-analytics entity list --metadata-equals``
+      * ``oci log-analytics entity update --metadata, --time-last-discovered``
+      * ``oci log-analytics entity upload-discovery-data --log-group-id``
+      * ``oci log-analytics entity-topology list --metadata-equals``
+    * Support for historic collection and log type while creating object collection rule
+      * ``oci log-analytics object-collection-rule create --is-force-historic-collection, --log-type``
+    * Support for position aware parsers
+      * ``oci log-analytics parser extract-structured-log-field-paths --is-position-aware``
+      * ``oci log-analytics parser extract-structured-log-header-paths --is-position-aware``
+      * ``oci log-analytics parser test-parser --is-position-aware``
+      * ``oci log-analytics parser upsert-parser --is-position-aware``
+    * Support for filtering detection rules based on target service
+      * ``oci log-analytics rule list --target-service``
+    * Support for filtering scheduled tasks based on target service
+      * ``oci log-analytics scheduled-task list --target-service``
+    * Support for filtering log sources based on their type
+      * ``oci log-analytics source list-sources --source-type``
+    * Support for additional recall and release attributes
+      * ``oci log-analytics storage recall-archived-data --is-use-recommended-data-set``
+      * ``oci log-analytics storage release-recalled-data --collection-id``
+    * Support for opc-meta-properties header while uploading log events
+      * ``oci log-analytics upload upload-log-events-file --opc-meta-properties``
+- Refresh patches for new version
+  * oc_relax-python-depends.patch
+- Update BuildRequires and Requires from setup.py
+
+-------------------------------------------------------------------
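
For reference, the new isReplicateAutomaticBackups setting can also be driven through the oci Python SDK
(>= 2.121.1) that this CLI wraps. A minimal, hedged sketch follows; the keyword name
change_disaster_recovery_configuration_details and the OCID are assumptions, and the dict payload mirrors
the camelCase body the CLI builds internally:

  # Hedged sketch: enable cross-region replication of automatic backups for an
  # Autonomous Database via the oci Python SDK (second keyword name is an assumption).
  import oci

  config = oci.config.from_file()                   # ~/.oci/config, DEFAULT profile
  client = oci.database.DatabaseClient(config)

  adb_id = "ocid1.autonomousdatabase.oc1..example"  # placeholder OCID
  response = client.change_disaster_recovery_configuration(
      autonomous_database_id=adb_id,
      change_disaster_recovery_configuration_details={
          "isReplicateAutomaticBackups": True
      },
  )
  print(response.data)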

Old:
----
  oci-cli-3.37.8.tar.gz

New:
----
  oci-cli-3.37.9.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ oci-cli.spec ++++++
--- /var/tmp/diff_new_pack.Y2FhkL/_old  2024-03-05 18:49:50.975273258 +0100
+++ /var/tmp/diff_new_pack.Y2FhkL/_new  2024-03-05 18:49:50.991273839 +0100
@@ -28,7 +28,7 @@
 %bcond_with test
 %endif
 Name:           oci-cli%{psuffix}
-Version:        3.37.8
+Version:        3.37.9
 Release:        0
 Summary:        Oracle Cloud Infrastructure CLI
 License:        Apache-2.0
@@ -46,7 +46,7 @@
 BuildRequires:  python3-cryptography >= 3.2.1
 BuildRequires:  python3-devel
 BuildRequires:  python3-jmespath >= 0.10.0
-BuildRequires:  python3-oci-sdk >= 2.121.0
+BuildRequires:  python3-oci-sdk >= 2.121.1
 BuildRequires:  python3-pyOpenSSL >= 22.1.0
 BuildRequires:  python3-python-dateutil >= 2.5.3
 BuildRequires:  python3-pytz >= 2016.10
@@ -85,7 +85,7 @@
 Requires:       python3-click >= 8.0.4
 Requires:       python3-cryptography >= 3.2.1
 Requires:       python3-jmespath >= 0.10.0
-Requires:       python3-oci-sdk >= 2.121.0
+Requires:       python3-oci-sdk >= 2.121.1
 Requires:       python3-prompt_toolkit >= 3.0.29
 Requires:       python3-pyOpenSSL >= 22.1.0
 Requires:       python3-python-dateutil >= 2.5.3
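
The python3-oci-sdk floor above follows the oci pin in upstream's requirements.txt (2.121.0 -> 2.121.1 in
this release). A hypothetical helper for reading that pin during a version bump might look like this (file
path, name and workflow are assumptions, not part of the package):

  # Hypothetical helper: print the oci version pinned in requirements.txt so the
  # spec's "python3-oci-sdk >=" BuildRequires/Requires can be bumped in lockstep.
  import re
  import sys

  def oci_pin(requirements_path="requirements.txt"):
      with open(requirements_path, encoding="utf-8") as handle:
          for line in handle:
              match = re.match(r"^oci\s*[=>]=\s*([\w.]+)", line.strip())
              if match:
                  return match.group(1)
      return None

  if __name__ == "__main__":
      path = sys.argv[1] if len(sys.argv) > 1 else "requirements.txt"
      print(oci_pin(path) or "no oci pin found")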

++++++ oc_relax-python-depends.patch ++++++
--- /var/tmp/diff_new_pack.Y2FhkL/_old  2024-03-05 18:49:51.231282554 +0100
+++ /var/tmp/diff_new_pack.Y2FhkL/_new  2024-03-05 18:49:51.263283715 +0100
@@ -1,6 +1,6 @@
-diff -Nru oci-cli-3.37.8.orig/requirements.txt oci-cli-3.37.8/requirements.txt
---- oci-cli-3.37.8.orig/requirements.txt       2024-02-06 07:35:21.000000000 
+0100
-+++ oci-cli-3.37.8/requirements.txt    2024-02-08 10:28:27.347609032 +0100
+diff -Nru oci-cli-3.37.9.orig/requirements.txt oci-cli-3.37.9/requirements.txt
+--- oci-cli-3.37.9.orig/requirements.txt       2024-02-13 11:20:04.000000000 
+0100
++++ oci-cli-3.37.9/requirements.txt    2024-02-16 14:28:54.514405130 +0100
 @@ -2,47 +2,47 @@
  # 
(https://pip.pypa.io/en/stable/reference/pip_install/#requirements-file-format),
  # you may need to use the --extra-index-url option instead.
@@ -18,7 +18,7 @@
 -jmespath==0.10.0
 -ndg-httpsclient==0.4.2
 -mock==2.0.0
--oci==2.121.0
+-oci==2.121.1
 -packaging==20.2
 -pluggy==0.13.0
 -py==1.11.0
@@ -38,7 +38,7 @@
 +jmespath>=0.10.0
 +ndg-httpsclient>=0.4.2
 +mock>=2.0.0
-+oci>=2.121.0
++oci>=2.121.1
 +packaging>=20.2
 +pluggy>=0.13.0
 +py>=1.11.0
@@ -53,7 +53,7 @@
  pytz>=2016.10
 -requests==2.21.0; python_version == '3.6'
 -requests==2.31.0; python_version > '3.6'
-+requests>=2.21.0; python_version >= '3.6'
++requests>=2.21.0; python_version == '3.6'
 +requests>=2.31.0; python_version > '3.6'
  six>=1.15.0
 -sphinx==3.3.0
@@ -82,18 +82,18 @@
 +prompt-toolkit>=3.0.29
  setuptools>65.5.1; python_version > '3.6'
 -setuptools==59.6.0; python_version == '3.6'
-+setuptools>=59.6.0; python_version >= '3.6'
++setuptools>=59.6.0; python_version == '3.6'
  # this is required because of python 3.6 requests dependency version bound
  urllib3<=1.26.15
-diff -Nru oci-cli-3.37.8.orig/setup.py oci-cli-3.37.8/setup.py
---- oci-cli-3.37.8.orig/setup.py       2024-02-06 07:35:21.000000000 +0100
-+++ oci-cli-3.37.8/setup.py    2024-02-08 10:27:25.770560495 +0100
+diff -Nru oci-cli-3.37.9.orig/setup.py oci-cli-3.37.9/setup.py
+--- oci-cli-3.37.9.orig/setup.py       2024-02-13 11:20:04.000000000 +0100
++++ oci-cli-3.37.9/setup.py    2024-02-16 14:26:29.467531027 +0100
 @@ -30,23 +30,23 @@
      readme = f.read()
  
  requires = [
--    'oci==2.121.0',
-+    'oci>=2.121.0',
+-    'oci==2.121.1',
++    'oci>=2.121.1',
      'arrow>=1.0.0',
      'certifi',
 -    'click==8.0.4',
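
oc_relax-python-depends.patch rewrites upstream's exact pins (==) as minimum versions (>=) so the CLI
accepts the library versions shipped in Factory, while environment markers such as python_version == '3.6'
stay untouched. A rough, stand-alone sketch of that transformation (illustrative only; the patch itself is
maintained as a diff and refreshed for each release):

  # Illustrative only: relax "pkg==X.Y.Z" pins to "pkg>=X.Y.Z" in a
  # requirements-style file, leaving environment markers after ';' unchanged.
  import re
  import sys

  PIN = re.compile(r"^(?P<name>[A-Za-z0-9._-]+)==(?P<version>[^;\s]+)")

  def relax_line(line):
      return PIN.sub(lambda m: "{}>={}".format(m.group("name"), m.group("version")), line)

  if __name__ == "__main__":
      path = sys.argv[1] if len(sys.argv) > 1 else "requirements.txt"
      with open(path, encoding="utf-8") as handle:
          for line in handle:
              sys.stdout.write(relax_line(line))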

++++++ oci-cli-3.37.8.tar.gz -> oci-cli-3.37.9.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/oci-cli-3.37.8/CHANGELOG.rst 
new/oci-cli-3.37.9/CHANGELOG.rst
--- old/oci-cli-3.37.8/CHANGELOG.rst    2024-02-06 07:35:21.000000000 +0100
+++ new/oci-cli-3.37.9/CHANGELOG.rst    2024-02-13 11:20:04.000000000 +0100
@@ -6,6 +6,60 @@
 
 The format is based on `Keep a Changelog <http://keepachangelog.com/>`__.
 
+3.37.9 - 2024-02-13
+-------------------
+Added
+~~~~~
+
+* Support for new optional parameter isReplicateAutomaticBackups in the Database Service
+
+  * ``oci db autonomous-database change-disaster-recovery-configuration --is-replicate-automatic-backups``
+  * ``oci db autonomous-database create-autonomous-database-create-cross-region-disaster-recovery-details --is-replicate-automatic-backups``
+ 
+Changed
+~~~~~~~
+* Loganalytics service
+
+  * Support for additional attributes in entity and topology
+
+    * ``oci log-analytics entity create --metadata, --time-last-discovered``
+    * ``oci log-analytics entity list --metadata-equals``
+    * ``oci log-analytics entity update --metadata, --time-last-discovered``
+    * ``oci log-analytics entity upload-discovery-data --log-group-id``
+    * ``oci log-analytics entity-topology list --metadata-equals``
+
+  * Support for historic collection and log type while creating object collection rule
+
+    * ``oci log-analytics object-collection-rule create --is-force-historic-collection, --log-type``
+
+  * Support for position aware parsers
+
+    * ``oci log-analytics parser extract-structured-log-field-paths --is-position-aware``
+    * ``oci log-analytics parser extract-structured-log-header-paths --is-position-aware``
+    * ``oci log-analytics parser test-parser --is-position-aware``
+    * ``oci log-analytics parser upsert-parser --is-position-aware``
+
+  * Support for filtering detection rules based on target service
+
+    * ``oci log-analytics rule list --target-service``
+
+  * Support for filtering scheduled tasks based on target service
+
+    * ``oci log-analytics scheduled-task list --target-service``
+
+  * Support for filtering log sources based on their type
+
+    * ``oci log-analytics source list-sources --source-type``
+
+  * Support for additional recall and release attributes
+
+    * ``oci log-analytics storage recall-archived-data --is-use-recommended-data-set``
+    * ``oci log-analytics storage release-recalled-data --collection-id``
+
+  * Support for opc-meta-properties header while uploading log events
+
+    * ``oci log-analytics upload upload-log-events-file --opc-meta-properties``
+
 3.37.8 - 2024-02-06
 --------------------
 Added
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/oci-cli-3.37.8/requirements.txt 
new/oci-cli-3.37.9/requirements.txt
--- old/oci-cli-3.37.8/requirements.txt 2024-02-06 07:35:21.000000000 +0100
+++ new/oci-cli-3.37.9/requirements.txt 2024-02-13 11:20:04.000000000 +0100
@@ -14,7 +14,7 @@
 jmespath==0.10.0
 ndg-httpsclient==0.4.2
 mock==2.0.0
-oci==2.121.0
+oci==2.121.1
 packaging==20.2
 pluggy==0.13.0
 py==1.11.0
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/oci-cli-3.37.8/services/ai_language/src/oci_cli_ai_service_language/aiservicelanguage_cli_extended.py
 
new/oci-cli-3.37.9/services/ai_language/src/oci_cli_ai_service_language/aiservicelanguage_cli_extended.py
--- 
old/oci-cli-3.37.8/services/ai_language/src/oci_cli_ai_service_language/aiservicelanguage_cli_extended.py
   2024-02-06 07:35:21.000000000 +0100
+++ 
new/oci-cli-3.37.9/services/ai_language/src/oci_cli_ai_service_language/aiservicelanguage_cli_extended.py
   2024-02-13 11:20:04.000000000 +0100
@@ -93,7 +93,6 @@
 
aiservicelanguage_cli.model_group.commands.pop(aiservicelanguage_cli.create_model_pre_trained_named_entity_recognition_model_details.name)
 
aiservicelanguage_cli.model_group.commands.pop(aiservicelanguage_cli.create_model_pre_trained_language_detection_model_details.name)
 
aiservicelanguage_cli.model_group.commands.pop(aiservicelanguage_cli.create_model_pre_trained_sentiment_analysis_model_details.name)
-aiservicelanguage_cli.model_group.commands.pop(aiservicelanguage_cli.create_model_pre_trained_phi_model_details.name)
 
aiservicelanguage_cli.model_group.commands.pop(aiservicelanguage_cli.create_model_pre_trained_text_classification_model_details.name)
 
aiservicelanguage_cli.model_group.commands.pop(aiservicelanguage_cli.create_model_pre_trained_summarization.name)
 
aiservicelanguage_cli.model_group.commands.pop(aiservicelanguage_cli.create_model_pre_trained_pii_model_details.name)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/oci-cli-3.37.8/services/database/src/oci_cli_database/generated/database_cli.py
 
new/oci-cli-3.37.9/services/database/src/oci_cli_database/generated/database_cli.py
--- 
old/oci-cli-3.37.8/services/database/src/oci_cli_database/generated/database_cli.py
 2024-02-06 07:35:21.000000000 +0100
+++ 
new/oci-cli-3.37.9/services/database/src/oci_cli_database/generated/database_cli.py
 2024-02-13 11:20:04.000000000 +0100
@@ -1637,6 +1637,7 @@
 @cli_util.option('--disaster-recovery-type', 
type=custom_types.CliCaseInsensitiveChoice(["ADG", "BACKUP_BASED"]), 
help=u"""Indicates the disaster recovery (DR) type of the Autonomous Database 
Serverless instance. Autonomous Data Guard (ADG) DR type provides business 
critical DR with a faster recovery time objective (RTO) during failover or 
switchover. Backup-based DR type provides lower cost DR with a slower RTO 
during failover or switchover.""")
 @cli_util.option('--time-snapshot-standby-enabled-till', 
type=custom_types.CLI_DATETIME, help=u"""Time and date stored as an RFC 3339 
formatted timestamp string. For example, 2022-01-01T12:00:00.000Z would set a 
limit for the snapshot standby to be converted back to a cross-region standby 
database.""" + custom_types.CLI_DATETIME.VALID_DATETIME_CLI_HELP_MESSAGE)
 @cli_util.option('--is-snapshot-standby', type=click.BOOL, help=u"""Indicates 
if user wants to convert to a snapshot standby. For example, true would set a 
standby database to snapshot standby database. False would set a snapshot 
standby database back to regular standby database.""")
+@cli_util.option('--is-replicate-automatic-backups', type=click.BOOL, 
help=u"""If true, 7 days worth of backups are replicated across regions for 
Cross-Region ADB or Backup-Based DR between Primary and Standby. If false, the 
backups taken on the Primary are not replicated to the Standby database.""")
 @cli_util.option('--if-match', help=u"""For optimistic concurrency control. In 
the PUT or DELETE call for a resource, set the `if-match` parameter to the 
value of the etag from a previous GET or POST response for that resource.  The 
resource will be updated or deleted only if the etag you provide matches the 
resource's current etag value.""")
 @cli_util.option('--wait-for-state', 
type=custom_types.CliCaseInsensitiveChoice(["PROVISIONING", "AVAILABLE", 
"STOPPING", "STOPPED", "STARTING", "TERMINATING", "TERMINATED", "UNAVAILABLE", 
"RESTORE_IN_PROGRESS", "RESTORE_FAILED", "BACKUP_IN_PROGRESS", 
"SCALE_IN_PROGRESS", "AVAILABLE_NEEDS_ATTENTION", "UPDATING", 
"MAINTENANCE_IN_PROGRESS", "RESTARTING", "RECREATING", 
"ROLE_CHANGE_IN_PROGRESS", "UPGRADING", "INACCESSIBLE", "STANDBY"]), 
multiple=True, help="""This operation creates, modifies or deletes a resource 
that has a defined lifecycle state. Specify this option to perform the action 
and then wait until the resource reaches a given lifecycle state. Multiple 
states can be specified, returning on the first state. For example, 
--wait-for-state SUCCEEDED --wait-for-state FAILED would return on whichever 
lifecycle state is reached first. If timeout is reached, a return code of 2 is 
returned. For any other error, a return code of 1 is returned.""")
 @cli_util.option('--max-wait-seconds', type=click.INT, help="""The maximum 
time to wait for the resource to reach the lifecycle state defined by 
--wait-for-state. Defaults to 1200 seconds.""")
@@ -1646,7 +1647,7 @@
 @click.pass_context
 
@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={},
 output_type={'module': 'database', 'class': 'AutonomousDatabase'})
 @cli_util.wrap_exceptions
-def change_disaster_recovery_configuration(ctx, from_json, wait_for_state, 
max_wait_seconds, wait_interval_seconds, autonomous_database_id, 
disaster_recovery_type, time_snapshot_standby_enabled_till, 
is_snapshot_standby, if_match):
+def change_disaster_recovery_configuration(ctx, from_json, wait_for_state, 
max_wait_seconds, wait_interval_seconds, autonomous_database_id, 
disaster_recovery_type, time_snapshot_standby_enabled_till, 
is_snapshot_standby, is_replicate_automatic_backups, if_match):
 
     if isinstance(autonomous_database_id, six.string_types) and 
len(autonomous_database_id.strip()) == 0:
         raise click.UsageError('Parameter --autonomous-database-id cannot be 
whitespace or empty string')
@@ -1667,6 +1668,9 @@
     if is_snapshot_standby is not None:
         _details['isSnapshotStandby'] = is_snapshot_standby
 
+    if is_replicate_automatic_backups is not None:
+        _details['isReplicateAutomaticBackups'] = 
is_replicate_automatic_backups
+
     client = cli_util.build_client('database', 'database', ctx)
     result = client.change_disaster_recovery_configuration(
         autonomous_database_id=autonomous_database_id,
@@ -4389,6 +4393,7 @@
 
 This cannot be used in conjunction with adminPassword.""")
 @cli_util.option('--secret-version-number', type=click.INT, help=u"""The 
version of the vault secret. If no version is specified, the latest version 
will be used.""")
+@cli_util.option('--is-replicate-automatic-backups', type=click.BOOL, 
help=u"""If true, 7 days worth of backups are replicated across regions for 
Cross-Region ADB or Backup-Based DR between Primary and Standby. If false, the 
backups taken on the Primary are not replicated to the Standby database.""")
 @cli_util.option('--wait-for-state', 
type=custom_types.CliCaseInsensitiveChoice(["PROVISIONING", "AVAILABLE", 
"STOPPING", "STOPPED", "STARTING", "TERMINATING", "TERMINATED", "UNAVAILABLE", 
"RESTORE_IN_PROGRESS", "RESTORE_FAILED", "BACKUP_IN_PROGRESS", 
"SCALE_IN_PROGRESS", "AVAILABLE_NEEDS_ATTENTION", "UPDATING", 
"MAINTENANCE_IN_PROGRESS", "RESTARTING", "RECREATING", 
"ROLE_CHANGE_IN_PROGRESS", "UPGRADING", "INACCESSIBLE", "STANDBY"]), 
multiple=True, help="""This operation creates, modifies or deletes a resource 
that has a defined lifecycle state. Specify this option to perform the action 
and then wait until the resource reaches a given lifecycle state. Multiple 
states can be specified, returning on the first state. For example, 
--wait-for-state SUCCEEDED --wait-for-state FAILED would return on whichever 
lifecycle state is reached first. If timeout is reached, a return code of 2 is 
returned. For any other error, a return code of 1 is returned.""")
 @cli_util.option('--max-wait-seconds', type=click.INT, help="""The maximum 
time to wait for the resource to reach the lifecycle state defined by 
--wait-for-state. Defaults to 1200 seconds.""")
 @cli_util.option('--wait-interval-seconds', type=click.INT, help="""Check 
every --wait-interval-seconds to see whether the resource has reached the 
lifecycle state defined by --wait-for-state. Defaults to 30 seconds.""")
@@ -4397,7 +4402,7 @@
 @click.pass_context
 
@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'whitelisted-ips':
 {'module': 'database', 'class': 'list[string]'}, 'standby-whitelisted-ips': 
{'module': 'database', 'class': 'list[string]'}, 'nsg-ids': {'module': 
'database', 'class': 'list[string]'}, 'freeform-tags': {'module': 'database', 
'class': 'dict(str, string)'}, 'defined-tags': {'module': 'database', 'class': 
'dict(str, dict(str, object))'}, 'customer-contacts': {'module': 'database', 
'class': 'list[CustomerContact]'}, 'resource-pool-summary': {'module': 
'database', 'class': 'ResourcePoolSummary'}, 'scheduled-operations': {'module': 
'database', 'class': 'list[ScheduledOperationDetails]'}, 'db-tools-details': 
{'module': 'database', 'class': 'list[DatabaseTool]'}}, output_type={'module': 
'database', 'class': 'AutonomousDatabase'})
 @cli_util.wrap_exceptions
-def 
create_autonomous_database_create_cross_region_disaster_recovery_details(ctx, 
from_json, wait_for_state, max_wait_seconds, wait_interval_seconds, 
compartment_id, source_id, remote_disaster_recovery_type, character_set, 
ncharacter_set, db_name, cpu_core_count, backup_retention_period_in_days, 
compute_model, compute_count, ocpu_count, db_workload, 
data_storage_size_in_tbs, data_storage_size_in_gbs, is_free_tier, kms_key_id, 
vault_id, admin_password, display_name, license_model, 
is_preview_version_with_service_terms_accepted, is_auto_scaling_enabled, 
is_dedicated, autonomous_container_database_id, in_memory_percentage, 
is_access_control_enabled, whitelisted_ips, are_primary_whitelisted_ips_used, 
standby_whitelisted_ips, is_data_guard_enabled, is_local_data_guard_enabled, 
subnet_id, nsg_ids, private_endpoint_label, freeform_tags, defined_tags, 
private_endpoint_ip, db_version, customer_contacts, 
is_mtls_connection_required, resource_pool_leader_id, resource_pool_summary, 
autonomous_m
 aintenance_schedule_type, scheduled_operations, 
is_auto_scaling_for_storage_enabled, max_cpu_core_count, database_edition, 
db_tools_details, secret_id, secret_version_number):
+def 
create_autonomous_database_create_cross_region_disaster_recovery_details(ctx, 
from_json, wait_for_state, max_wait_seconds, wait_interval_seconds, 
compartment_id, source_id, remote_disaster_recovery_type, character_set, 
ncharacter_set, db_name, cpu_core_count, backup_retention_period_in_days, 
compute_model, compute_count, ocpu_count, db_workload, 
data_storage_size_in_tbs, data_storage_size_in_gbs, is_free_tier, kms_key_id, 
vault_id, admin_password, display_name, license_model, 
is_preview_version_with_service_terms_accepted, is_auto_scaling_enabled, 
is_dedicated, autonomous_container_database_id, in_memory_percentage, 
is_access_control_enabled, whitelisted_ips, are_primary_whitelisted_ips_used, 
standby_whitelisted_ips, is_data_guard_enabled, is_local_data_guard_enabled, 
subnet_id, nsg_ids, private_endpoint_label, freeform_tags, defined_tags, 
private_endpoint_ip, db_version, customer_contacts, 
is_mtls_connection_required, resource_pool_leader_id, resource_pool_summary, 
autonomous_m
 aintenance_schedule_type, scheduled_operations, 
is_auto_scaling_for_storage_enabled, max_cpu_core_count, database_edition, 
db_tools_details, secret_id, secret_version_number, 
is_replicate_automatic_backups):
 
     kwargs = {}
     kwargs['opc_request_id'] = 
cli_util.use_or_generate_request_id(ctx.obj['request_id'])
@@ -4548,6 +4553,9 @@
     if secret_version_number is not None:
         _details['secretVersionNumber'] = secret_version_number
 
+    if is_replicate_automatic_backups is not None:
+        _details['isReplicateAutomaticBackups'] = 
is_replicate_automatic_backups
+
     _details['source'] = 'CROSS_REGION_DISASTER_RECOVERY'
 
     client = cli_util.build_client('database', 'database', ctx)
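
The generated command wires the new flag the same way as every other optional parameter: declare a click
option, thread it through the function signature, and copy it into the request payload only when the user
actually passed it, so the service-side default is preserved otherwise. A stripped-down sketch of that
pattern with plain click (no oci-cli cli_util helpers; command name and payload are illustrative):

  # Minimal sketch: optional BOOL flag that is only added to the request body
  # when explicitly supplied on the command line (tri-state unset/true/false).
  import json
  import click

  @click.command()
  @click.option("--is-replicate-automatic-backups", type=click.BOOL, default=None,
                help="Replicate 7 days of backups across regions.")
  def change_disaster_recovery_configuration(is_replicate_automatic_backups):
      details = {}
      if is_replicate_automatic_backups is not None:
          # camelCase key matches the REST payload the real command builds.
          details["isReplicateAutomaticBackups"] = is_replicate_automatic_backups
      click.echo(json.dumps(details, indent=2))  # stand-in for the SDK call

  if __name__ == "__main__":
      change_disaster_recovery_configuration()
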
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/oci-cli-3.37.8/services/generative_ai_inference/src/oci_cli_generative_ai_inference/generative_ai_inference_cli_extended.py
 
new/oci-cli-3.37.9/services/generative_ai_inference/src/oci_cli_generative_ai_inference/generative_ai_inference_cli_extended.py
--- 
old/oci-cli-3.37.8/services/generative_ai_inference/src/oci_cli_generative_ai_inference/generative_ai_inference_cli_extended.py
     2024-02-06 07:35:21.000000000 +0100
+++ 
new/oci-cli-3.37.9/services/generative_ai_inference/src/oci_cli_generative_ai_inference/generative_ai_inference_cli_extended.py
     2024-02-13 11:20:04.000000000 +0100
@@ -2,6 +2,11 @@
 # Copyright (c) 2016, 2021, Oracle and/or its affiliates.  All rights reserved.
 # This software is dual-licensed to you under the Universal Permissive License 
(UPL) 1.0 as shown at https://oss.oracle.com/licenses/upl or Apache License 2.0 
as shown at http://www.apache.org/licenses/LICENSE-2.0. You may choose either 
license.
 
+import click  # noqa: F401
+import json  # noqa: F401
+from oci_cli import cli_util  # noqa: F401
+from oci_cli import custom_types  # noqa: F401
+from oci_cli import json_skeleton_utils  # noqa: F401
 from 
services.generative_ai_inference.src.oci_cli_generative_ai_inference.generated 
import generativeaiinference_cli
 
 # Remove oci generative-ai-inference deprecated commands:
@@ -15,3 +20,27 @@
 # EmbedText API: embed_text_on_demand_serving_mode, 
embed_text_dedicated_serving_mode
 
generativeaiinference_cli.embed_text_result_group.commands.pop(generativeaiinference_cli.embed_text_on_demand_serving_mode.name)
 
generativeaiinference_cli.embed_text_result_group.commands.pop(generativeaiinference_cli.embed_text_dedicated_serving_mode.name)
+
+
+@cli_util.copy_params_from_generated_command(generativeaiinference_cli.generate_text_cohere_llm_inference_request,
 params_to_exclude=['inference_request_is_stream'])
+@generativeaiinference_cli.generate_text_result_group.command(name='generate-text-cohere-llm-inference-request',
 help=generativeaiinference_cli.generate_text_cohere_llm_inference_request.help)
+@click.pass_context
+@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'serving-mode':
 {'module': 'generative_ai_inference', 'class': 'ServingMode'}, 
'inference-request-stop-sequences': {'module': 'generative_ai_inference', 
'class': 'list[string]'}}, output_type={'module': 'generative_ai_inference', 
'class': 'GenerateTextResult'})
+@cli_util.wrap_exceptions
+def generate_text_cohere_llm_inference_request_extended(ctx, **kwargs):
+
+    kwargs['inference_request_is_stream'] = False
+
+    
ctx.invoke(generativeaiinference_cli.generate_text_cohere_llm_inference_request,
 **kwargs)
+
+
+@cli_util.copy_params_from_generated_command(generativeaiinference_cli.generate_text_llama_llm_inference_request,
 params_to_exclude=['inference_request_is_stream'])
+@generativeaiinference_cli.generate_text_result_group.command(name='generate-text-llama-llm-inference-request',
 help=generativeaiinference_cli.generate_text_llama_llm_inference_request.help)
+@click.pass_context
+@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'serving-mode':
 {'module': 'generative_ai_inference', 'class': 'ServingMode'}, 
'inference-request-stop': {'module': 'generative_ai_inference', 'class': 
'list[string]'}}, output_type={'module': 'generative_ai_inference', 'class': 
'GenerateTextResult'})
+@cli_util.wrap_exceptions
+def generate_text_llama_llm_inference_request_extended(ctx, **kwargs):
+
+    kwargs['inference_request_is_stream'] = False
+
+    
ctx.invoke(generativeaiinference_cli.generate_text_llama_llm_inference_request, 
**kwargs)
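
The extension above hides the streaming flag from the CLI surface: the two generate-text commands are
re-registered without inference_request_is_stream, the value is pinned to False, and the call is delegated
to the generated command via ctx.invoke. Roughly the same pattern with plain click (names are made up for
the sketch; oci-cli's copy_params_from_generated_command does the option copying for real):

  # Sketch of wrapping a generated command: drop one option from the wrapper,
  # force its value, then delegate to the original callback with ctx.invoke.
  import click

  @click.command(name="generate-text-generated")
  @click.option("--prompt", required=True)
  @click.option("--is-stream", type=click.BOOL, default=None)
  def generated_generate_text(prompt, is_stream):
      click.echo("prompt={!r} is_stream={}".format(prompt, is_stream))

  @click.command(name="generate-text")
  @click.option("--prompt", required=True)  # copied by hand in this sketch
  @click.pass_context
  def generate_text_extended(ctx, **kwargs):
      kwargs["is_stream"] = False           # streaming not supported via this path
      ctx.invoke(generated_generate_text, **kwargs)

  if __name__ == "__main__":
      generate_text_extended()
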
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/oci-cli-3.37.8/services/globally_distributed_database/src/oci_cli_sharded_database_service/generated/shardeddatabaseservice_cli.py
 
new/oci-cli-3.37.9/services/globally_distributed_database/src/oci_cli_sharded_database_service/generated/shardeddatabaseservice_cli.py
--- 
old/oci-cli-3.37.8/services/globally_distributed_database/src/oci_cli_sharded_database_service/generated/shardeddatabaseservice_cli.py
      2024-02-06 07:35:21.000000000 +0100
+++ 
new/oci-cli-3.37.9/services/globally_distributed_database/src/oci_cli_sharded_database_service/generated/shardeddatabaseservice_cli.py
      2024-02-13 11:20:04.000000000 +0100
@@ -113,6 +113,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -170,6 +174,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -229,6 +237,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -283,6 +295,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -352,6 +368,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -413,6 +433,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -506,6 +530,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -558,6 +586,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -610,6 +642,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -777,6 +813,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -1258,6 +1298,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -1392,6 +1436,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -1443,6 +1491,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -1652,6 +1704,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -1712,6 +1768,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
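
Each wait-for-state branch in these sharded-database commands now checks for the opc-work-request-id header
before polling, and falls back to printing the last known resource state when the header is missing rather
than attempting to look it up. The repeated guard, distilled into a stand-alone sketch (response class and
helper name are assumptions for illustration):

  # Sketch of the new guard: only poll the work request when the service
  # actually returned an opc-work-request-id header.
  class _Result:
      """Stand-in for the SDK response object (only .headers/.data are used)."""
      def __init__(self, headers, data):
          self.headers, self.data = headers, data

  def wait_or_render(result, wait_fn):
      if "opc-work-request-id" not in result.headers:
          print("Encountered error while waiting for work request to enter the "
                "specified state. Outputting last known resource state")
          print(result.data)
          return result
      return wait_fn(result.headers["opc-work-request-id"])

  if __name__ == "__main__":
      # No header -> the guard prints the resource instead of failing.
      wait_or_render(_Result(headers={}, data={"lifecycleState": "FAILED"}),
                     wait_fn=lambda work_request_id: None)
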
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/oci-cli-3.37.8/services/log_analytics/src/oci_cli_log_analytics/generated/loganalytics_cli.py
 
new/oci-cli-3.37.9/services/log_analytics/src/oci_cli_log_analytics/generated/loganalytics_cli.py
--- 
old/oci-cli-3.37.8/services/log_analytics/src/oci_cli_log_analytics/generated/loganalytics_cli.py
   2024-02-06 07:35:21.000000000 +0100
+++ 
new/oci-cli-3.37.9/services/log_analytics/src/oci_cli_log_analytics/generated/loganalytics_cli.py
   2024-02-13 11:20:04.000000000 +0100
@@ -1059,15 +1059,17 @@
 @cli_util.option('--properties', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""The name/value pairs for parameter values to be used in file patterns 
specified in log sources.""" + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
 @cli_util.option('--freeform-tags', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""Simple key-value pair that is applied without any predefined name, 
type or scope. Exists for cross-compatibility only. Example: `{\"bar-key\": 
\"value\"}`""" + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
 @cli_util.option('--defined-tags', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""Defined tags for this resource. Each key is predefined and scoped to a 
namespace. Example: `{\"foo-namespace\": {\"bar-key\": \"value\"}}`""" + 
custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
+@cli_util.option('--time-last-discovered', type=custom_types.CLI_DATETIME, 
help=u"""The date and time the resource was last discovered, in the format 
defined by RFC3339.""" + 
custom_types.CLI_DATETIME.VALID_DATETIME_CLI_HELP_MESSAGE)
+@cli_util.option('--metadata', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""""" + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
 @cli_util.option('--wait-for-state', 
type=custom_types.CliCaseInsensitiveChoice(["ACTIVE", "DELETED"]), 
multiple=True, help="""This operation creates, modifies or deletes a resource 
that has a defined lifecycle state. Specify this option to perform the action 
and then wait until the resource reaches a given lifecycle state. Multiple 
states can be specified, returning on the first state. For example, 
--wait-for-state SUCCEEDED --wait-for-state FAILED would return on whichever 
lifecycle state is reached first. If timeout is reached, a return code of 2 is 
returned. For any other error, a return code of 1 is returned.""")
 @cli_util.option('--max-wait-seconds', type=click.INT, help="""The maximum 
time to wait for the resource to reach the lifecycle state defined by 
--wait-for-state. Defaults to 1200 seconds.""")
 @cli_util.option('--wait-interval-seconds', type=click.INT, help="""Check 
every --wait-interval-seconds to see whether the resource has reached the 
lifecycle state defined by --wait-for-state. Defaults to 30 seconds.""")
-@json_skeleton_utils.get_cli_json_input_option({'properties': {'module': 
'log_analytics', 'class': 'dict(str, string)'}, 'freeform-tags': {'module': 
'log_analytics', 'class': 'dict(str, string)'}, 'defined-tags': {'module': 
'log_analytics', 'class': 'dict(str, dict(str, object))'}})
+@json_skeleton_utils.get_cli_json_input_option({'properties': {'module': 
'log_analytics', 'class': 'dict(str, string)'}, 'freeform-tags': {'module': 
'log_analytics', 'class': 'dict(str, string)'}, 'defined-tags': {'module': 
'log_analytics', 'class': 'dict(str, dict(str, object))'}, 'metadata': 
{'module': 'log_analytics', 'class': 'LogAnalyticsMetadataDetails'}})
 @cli_util.help_option
 @click.pass_context
-@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'properties':
 {'module': 'log_analytics', 'class': 'dict(str, string)'}, 'freeform-tags': 
{'module': 'log_analytics', 'class': 'dict(str, string)'}, 'defined-tags': 
{'module': 'log_analytics', 'class': 'dict(str, dict(str, object))'}}, 
output_type={'module': 'log_analytics', 'class': 'LogAnalyticsEntity'})
+@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'properties':
 {'module': 'log_analytics', 'class': 'dict(str, string)'}, 'freeform-tags': 
{'module': 'log_analytics', 'class': 'dict(str, string)'}, 'defined-tags': 
{'module': 'log_analytics', 'class': 'dict(str, dict(str, object))'}, 
'metadata': {'module': 'log_analytics', 'class': 
'LogAnalyticsMetadataDetails'}}, output_type={'module': 'log_analytics', 
'class': 'LogAnalyticsEntity'})
 @cli_util.wrap_exceptions
-def create_log_analytics_entity(ctx, from_json, wait_for_state, 
max_wait_seconds, wait_interval_seconds, namespace_name, name, compartment_id, 
entity_type_name, management_agent_id, cloud_resource_id, timezone_region, 
hostname, source_id, properties, freeform_tags, defined_tags):
+def create_log_analytics_entity(ctx, from_json, wait_for_state, 
max_wait_seconds, wait_interval_seconds, namespace_name, name, compartment_id, 
entity_type_name, management_agent_id, cloud_resource_id, timezone_region, 
hostname, source_id, properties, freeform_tags, defined_tags, 
time_last_discovered, metadata):
 
     if isinstance(namespace_name, six.string_types) and 
len(namespace_name.strip()) == 0:
         raise click.UsageError('Parameter --namespace-name cannot be 
whitespace or empty string')
@@ -1104,6 +1106,12 @@
     if defined_tags is not None:
         _details['definedTags'] = 
cli_util.parse_json_parameter("defined_tags", defined_tags)
 
+    if time_last_discovered is not None:
+        _details['timeLastDiscovered'] = time_last_discovered
+
+    if metadata is not None:
+        _details['metadata'] = cli_util.parse_json_parameter("metadata", 
metadata)
+
     client = cli_util.build_client('log_analytics', 'log_analytics', ctx)
     result = client.create_log_analytics_entity(
         namespace_name=namespace_name,
@@ -1237,6 +1245,8 @@
 @cli_util.option('--log-set-ext-regex', help=u"""The regex to be applied 
against given logSetKey. Regex has to be in string escaped format.""")
 @cli_util.option('--overrides', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""The override is used to modify some important configuration properties 
for objects matching a specific pattern inside the bucket. Supported propeties 
for override are: logSourceName, charEncoding, entityId. Supported matchType 
for override are \"contains\".""" + 
custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
 @cli_util.option('--object-name-filters', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""When the filters are provided, only the objects matching the filters 
are picked up for processing. The matchType supported is exact match and 
accommodates wildcard \"*\". For more information on filters, see [Event 
Filters].""" + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
+@cli_util.option('--log-type', 
type=custom_types.CliCaseInsensitiveChoice(["LOG", "LOG_EVENTS"]), 
help=u"""Type of files/objects in this object collection rule.""")
+@cli_util.option('--is-force-historic-collection', type=click.BOOL, 
help=u"""Flag to allow historic collection if poll period overlaps with 
existing ACTIVE collection rule""")
 @cli_util.option('--defined-tags', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""Defined tags for this resource. Each key is predefined and scoped to a 
namespace. Example: `{\"foo-namespace\": {\"bar-key\": \"value\"}}`""" + 
custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
 @cli_util.option('--freeform-tags', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""Simple key-value pair that is applied without any predefined name, 
type or scope. Exists for cross-compatibility only. Example: `{\"bar-key\": 
\"value\"}`""" + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
 @cli_util.option('--wait-for-state', 
type=custom_types.CliCaseInsensitiveChoice(["ACTIVE", "DELETED", "INACTIVE"]), 
multiple=True, help="""This operation creates, modifies or deletes a resource 
that has a defined lifecycle state. Specify this option to perform the action 
and then wait until the resource reaches a given lifecycle state. Multiple 
states can be specified, returning on the first state. For example, 
--wait-for-state SUCCEEDED --wait-for-state FAILED would return on whichever 
lifecycle state is reached first. If timeout is reached, a return code of 2 is 
returned. For any other error, a return code of 1 is returned.""")
@@ -1247,7 +1257,7 @@
 @click.pass_context
 
@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'overrides':
 {'module': 'log_analytics', 'class': 'dict(str, list[PropertyOverride])'}, 
'object-name-filters': {'module': 'log_analytics', 'class': 'list[string]'}, 
'defined-tags': {'module': 'log_analytics', 'class': 'dict(str, dict(str, 
object))'}, 'freeform-tags': {'module': 'log_analytics', 'class': 'dict(str, 
string)'}}, output_type={'module': 'log_analytics', 'class': 
'LogAnalyticsObjectCollectionRule'})
 @cli_util.wrap_exceptions
-def create_log_analytics_object_collection_rule(ctx, from_json, 
wait_for_state, max_wait_seconds, wait_interval_seconds, namespace_name, name, 
compartment_id, os_namespace, os_bucket_name, log_group_id, log_source_name, 
description, collection_type, poll_since, poll_till, entity_id, char_encoding, 
is_enabled, timezone, log_set, log_set_key, log_set_ext_regex, overrides, 
object_name_filters, defined_tags, freeform_tags):
+def create_log_analytics_object_collection_rule(ctx, from_json, 
wait_for_state, max_wait_seconds, wait_interval_seconds, namespace_name, name, 
compartment_id, os_namespace, os_bucket_name, log_group_id, log_source_name, 
description, collection_type, poll_since, poll_till, entity_id, char_encoding, 
is_enabled, timezone, log_set, log_set_key, log_set_ext_regex, overrides, 
object_name_filters, log_type, is_force_historic_collection, defined_tags, 
freeform_tags):
 
     if isinstance(namespace_name, six.string_types) and 
len(namespace_name.strip()) == 0:
         raise click.UsageError('Parameter --namespace-name cannot be 
whitespace or empty string')
@@ -1302,6 +1312,12 @@
     if object_name_filters is not None:
         _details['objectNameFilters'] = 
cli_util.parse_json_parameter("object_name_filters", object_name_filters)
 
+    if log_type is not None:
+        _details['logType'] = log_type
+
+    if is_force_historic_collection is not None:
+        _details['isForceHistoricCollection'] = is_force_historic_collection
+
     if defined_tags is not None:
         _details['definedTags'] = 
cli_util.parse_json_parameter("defined_tags", defined_tags)
 
@@ -2685,7 +2701,7 @@
 @cli_util.option('--scope-filters', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""List of filters to be applied when the query executes. More than one 
filter per field is not permitted.
 
 This option is a JSON list with items of type ScopeFilter.  For documentation 
on ScopeFilter please see our API reference: 
https://docs.cloud.oracle.com/api/#/en/loganalytics/20200601/datatypes/ScopeFilter.""";
 + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
-@cli_util.option('--max-total-count', type=click.INT, help=u"""Maximum number 
of results retrieved from data source is determined by the specific query used 
and the maxTotalCount input field. If the export results can be streamed, the 
maximum will be 1,000,000. If the results cannot be streamed, the maximum limit 
is 500 for queries that include the link command and 10,000 for the queries 
that does not include the link command.
+@cli_util.option('--max-total-count', type=click.INT, help=u"""Maximum number 
of results retrieved from data source is determined by the specific query used 
and the maxTotalCount input field. If the export results can be streamed, the 
maximum will be 1,000,000. If the results cannot be streamed, the maximum limit 
is 500 for queries that include the link command and 10,000 for the queries 
that do not include the link command.
 
 Queries that include certain commands such as head, tail or stats cannot be 
streamed and are subject to a maximum of 10,000 results. Queries that include 
the sort command cannot be streamed unless the sort fields are restricted to id 
and/or time. The maximum number of results retrieved is the lesser of the 
maxTotalCount input provided and the applicable limit described above.""")
 @cli_util.option('--time-filter', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""""" + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
@@ -2815,13 +2831,20 @@
 @cli_util.option('--categories', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""An array of categories assigned to this parser. The isSystem flag 
denotes if each category assignment is user-created or Oracle-defined.
 
 This option is a JSON list with items of type LogAnalyticsCategory.  For 
documentation on LogAnalyticsCategory please see our API reference: 
https://docs.cloud.oracle.com/api/#/en/loganalytics/20200601/datatypes/LogAnalyticsCategory.""";
 + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
+@cli_util.option('--is-position-aware', type=click.BOOL, help=u"""A flag 
indicating whether the parser is positionally aware.""")
+@cli_util.option('--dependent-sources', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""A list of sources that depend on the parser, either directly or 
indirectly.
+
+This option is a JSON list with items of type DependentSource.  For 
documentation on DependentSource please see our API reference: 
https://docs.cloud.oracle.com/api/#/en/loganalytics/20200601/datatypes/DependentSource.""";
 + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
+@cli_util.option('--dependent-parsers', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""A list of sub parsers used by this parser.
+
+This option is a JSON list with items of type DependentParser.  For 
documentation on DependentParser please see our API reference: 
https://docs.cloud.oracle.com/api/#/en/loganalytics/20200601/datatypes/DependentParser.""";
 + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
 @cli_util.option('--parser-type', 
type=custom_types.CliCaseInsensitiveChoice(["XML", "JSON", "DELIMITED"]), 
help=u"""The parser type - possible values are XML, JSON or DELIMITED.""")
-@json_skeleton_utils.get_cli_json_input_option({'field-maps': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsParserField]'}, 'mapped-parsers': 
{'module': 'log_analytics', 'class': 'list[LogAnalyticsParser]'}, 
'parser-filter': {'module': 'log_analytics', 'class': 
'LogAnalyticsParserFilter'}, 'parser-functions': {'module': 'log_analytics', 
'class': 'list[LogAnalyticsParserFunction]'}, 'sources': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsSource]'}, 'categories': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsCategory]'}})
+@json_skeleton_utils.get_cli_json_input_option({'field-maps': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsParserField]'}, 'mapped-parsers': 
{'module': 'log_analytics', 'class': 'list[LogAnalyticsParser]'}, 
'parser-filter': {'module': 'log_analytics', 'class': 
'LogAnalyticsParserFilter'}, 'parser-functions': {'module': 'log_analytics', 
'class': 'list[LogAnalyticsParserFunction]'}, 'sources': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsSource]'}, 'categories': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsCategory]'}, 'dependent-sources': 
{'module': 'log_analytics', 'class': 'list[DependentSource]'}, 
'dependent-parsers': {'module': 'log_analytics', 'class': 
'list[DependentParser]'}})
 @cli_util.help_option
 @click.pass_context
-@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'field-maps':
 {'module': 'log_analytics', 'class': 'list[LogAnalyticsParserField]'}, 
'mapped-parsers': {'module': 'log_analytics', 'class': 
'list[LogAnalyticsParser]'}, 'parser-filter': {'module': 'log_analytics', 
'class': 'LogAnalyticsParserFilter'}, 'parser-functions': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsParserFunction]'}, 'sources': 
{'module': 'log_analytics', 'class': 'list[LogAnalyticsSource]'}, 'categories': 
{'module': 'log_analytics', 'class': 'list[LogAnalyticsCategory]'}}, 
output_type={'module': 'log_analytics', 'class': 'ExtractLogFieldResults'})
+@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'field-maps':
 {'module': 'log_analytics', 'class': 'list[LogAnalyticsParserField]'}, 
'mapped-parsers': {'module': 'log_analytics', 'class': 
'list[LogAnalyticsParser]'}, 'parser-filter': {'module': 'log_analytics', 
'class': 'LogAnalyticsParserFilter'}, 'parser-functions': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsParserFunction]'}, 'sources': 
{'module': 'log_analytics', 'class': 'list[LogAnalyticsSource]'}, 'categories': 
{'module': 'log_analytics', 'class': 'list[LogAnalyticsCategory]'}, 
'dependent-sources': {'module': 'log_analytics', 'class': 
'list[DependentSource]'}, 'dependent-parsers': {'module': 'log_analytics', 
'class': 'list[DependentParser]'}}, output_type={'module': 'log_analytics', 
'class': 'ExtractLogFieldResults'})
 @cli_util.wrap_exceptions
-def extract_structured_log_field_paths(ctx, from_json, namespace_name, 
content, description, display_name, edit_version, encoding, example_content, 
field_maps, footer_content, header_content, name, is_default, 
is_single_line_content, is_system, language, time_updated, 
log_type_test_request_version, mapped_parsers, parser_ignoreline_characters, 
is_hidden, parser_sequence, parser_timezone, parser_filter, 
is_parser_written_once, parser_functions, sources_count, sources, 
should_tokenize_original_text, field_delimiter, field_qualifier, type, 
is_user_deleted, is_namespace_aware, categories, parser_type):
+def extract_structured_log_field_paths(ctx, from_json, namespace_name, 
content, description, display_name, edit_version, encoding, example_content, 
field_maps, footer_content, header_content, name, is_default, 
is_single_line_content, is_system, language, time_updated, 
log_type_test_request_version, mapped_parsers, parser_ignoreline_characters, 
is_hidden, parser_sequence, parser_timezone, parser_filter, 
is_parser_written_once, parser_functions, sources_count, sources, 
should_tokenize_original_text, field_delimiter, field_qualifier, type, 
is_user_deleted, is_namespace_aware, categories, is_position_aware, 
dependent_sources, dependent_parsers, parser_type):
 
     if isinstance(namespace_name, six.string_types) and 
len(namespace_name.strip()) == 0:
         raise click.UsageError('Parameter --namespace-name cannot be 
whitespace or empty string')
@@ -2932,6 +2955,15 @@
     if categories is not None:
         _details['categories'] = cli_util.parse_json_parameter("categories", 
categories)
 
+    if is_position_aware is not None:
+        _details['isPositionAware'] = is_position_aware
+
+    if dependent_sources is not None:
+        _details['dependentSources'] = 
cli_util.parse_json_parameter("dependent_sources", dependent_sources)
+
+    if dependent_parsers is not None:
+        _details['dependentParsers'] = 
cli_util.parse_json_parameter("dependent_parsers", dependent_parsers)
+
     client = cli_util.build_client('log_analytics', 'log_analytics', ctx)
     result = client.extract_structured_log_field_paths(
         namespace_name=namespace_name,
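
The hunk above only attaches the new keys when the corresponding option was supplied. A minimal standalone sketch (not part of the diff) of that mapping, with hypothetical input values and plain json.loads standing in for cli_util.parse_json_parameter:

    import json

    # Hypothetical option values as they would arrive from the command line.
    is_position_aware = True
    dependent_sources = '[{"name": "example_source"}]'
    dependent_parsers = '[{"name": "example_sub_parser"}]'

    _details = {}
    if is_position_aware is not None:
        _details['isPositionAware'] = is_position_aware
    if dependent_sources is not None:
        _details['dependentSources'] = json.loads(dependent_sources)
    if dependent_parsers is not None:
        _details['dependentParsers'] = json.loads(dependent_parsers)

    print(_details)
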
@@ -2986,13 +3018,20 @@
 @cli_util.option('--categories', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""An array of categories assigned to this parser. The isSystem flag 
denotes if each category assignment is user-created or Oracle-defined.
 
 This option is a JSON list with items of type LogAnalyticsCategory.  For 
documentation on LogAnalyticsCategory please see our API reference: 
https://docs.cloud.oracle.com/api/#/en/loganalytics/20200601/datatypes/LogAnalyticsCategory."""
 + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
+@cli_util.option('--is-position-aware', type=click.BOOL, help=u"""A flag 
indicating whether the parser is positionally aware.""")
+@cli_util.option('--dependent-sources', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""A list of sources that depend on the parser, either directly or 
indirectly.
+
+This option is a JSON list with items of type DependentSource.  For 
documentation on DependentSource please see our API reference: 
https://docs.cloud.oracle.com/api/#/en/loganalytics/20200601/datatypes/DependentSource."""
 + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
+@cli_util.option('--dependent-parsers', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""A list of sub parsers used by this parser.
+
+This option is a JSON list with items of type DependentParser.  For 
documentation on DependentParser please see our API reference: 
https://docs.cloud.oracle.com/api/#/en/loganalytics/20200601/datatypes/DependentParser."""
 + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
 @cli_util.option('--parser-type', 
type=custom_types.CliCaseInsensitiveChoice(["XML", "JSON", "DELIMITED"]), 
help=u"""The parser type - possible values are XML, JSON or DELIMITED.""")
-@json_skeleton_utils.get_cli_json_input_option({'field-maps': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsParserField]'}, 'mapped-parsers': 
{'module': 'log_analytics', 'class': 'list[LogAnalyticsParser]'}, 
'parser-filter': {'module': 'log_analytics', 'class': 
'LogAnalyticsParserFilter'}, 'parser-functions': {'module': 'log_analytics', 
'class': 'list[LogAnalyticsParserFunction]'}, 'sources': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsSource]'}, 'categories': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsCategory]'}})
+@json_skeleton_utils.get_cli_json_input_option({'field-maps': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsParserField]'}, 'mapped-parsers': 
{'module': 'log_analytics', 'class': 'list[LogAnalyticsParser]'}, 
'parser-filter': {'module': 'log_analytics', 'class': 
'LogAnalyticsParserFilter'}, 'parser-functions': {'module': 'log_analytics', 
'class': 'list[LogAnalyticsParserFunction]'}, 'sources': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsSource]'}, 'categories': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsCategory]'}, 'dependent-sources': 
{'module': 'log_analytics', 'class': 'list[DependentSource]'}, 
'dependent-parsers': {'module': 'log_analytics', 'class': 
'list[DependentParser]'}})
 @cli_util.help_option
 @click.pass_context
-@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'field-maps':
 {'module': 'log_analytics', 'class': 'list[LogAnalyticsParserField]'}, 
'mapped-parsers': {'module': 'log_analytics', 'class': 
'list[LogAnalyticsParser]'}, 'parser-filter': {'module': 'log_analytics', 
'class': 'LogAnalyticsParserFilter'}, 'parser-functions': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsParserFunction]'}, 'sources': 
{'module': 'log_analytics', 'class': 'list[LogAnalyticsSource]'}, 'categories': 
{'module': 'log_analytics', 'class': 'list[LogAnalyticsCategory]'}}, 
output_type={'module': 'log_analytics', 'class': 'ExtractLogHeaderResults'})
+@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'field-maps':
 {'module': 'log_analytics', 'class': 'list[LogAnalyticsParserField]'}, 
'mapped-parsers': {'module': 'log_analytics', 'class': 
'list[LogAnalyticsParser]'}, 'parser-filter': {'module': 'log_analytics', 
'class': 'LogAnalyticsParserFilter'}, 'parser-functions': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsParserFunction]'}, 'sources': 
{'module': 'log_analytics', 'class': 'list[LogAnalyticsSource]'}, 'categories': 
{'module': 'log_analytics', 'class': 'list[LogAnalyticsCategory]'}, 
'dependent-sources': {'module': 'log_analytics', 'class': 
'list[DependentSource]'}, 'dependent-parsers': {'module': 'log_analytics', 
'class': 'list[DependentParser]'}}, output_type={'module': 'log_analytics', 
'class': 'ExtractLogHeaderResults'})
 @cli_util.wrap_exceptions
-def extract_structured_log_header_paths(ctx, from_json, namespace_name, 
content, description, display_name, edit_version, encoding, example_content, 
field_maps, footer_content, header_content, name, is_default, 
is_single_line_content, is_system, language, time_updated, 
log_type_test_request_version, mapped_parsers, parser_ignoreline_characters, 
is_hidden, parser_sequence, parser_timezone, parser_filter, 
is_parser_written_once, parser_functions, sources_count, sources, 
should_tokenize_original_text, field_delimiter, field_qualifier, type, 
is_user_deleted, is_namespace_aware, categories, parser_type):
+def extract_structured_log_header_paths(ctx, from_json, namespace_name, 
content, description, display_name, edit_version, encoding, example_content, 
field_maps, footer_content, header_content, name, is_default, 
is_single_line_content, is_system, language, time_updated, 
log_type_test_request_version, mapped_parsers, parser_ignoreline_characters, 
is_hidden, parser_sequence, parser_timezone, parser_filter, 
is_parser_written_once, parser_functions, sources_count, sources, 
should_tokenize_original_text, field_delimiter, field_qualifier, type, 
is_user_deleted, is_namespace_aware, categories, is_position_aware, 
dependent_sources, dependent_parsers, parser_type):
 
     if isinstance(namespace_name, six.string_types) and 
len(namespace_name.strip()) == 0:
         raise click.UsageError('Parameter --namespace-name cannot be 
whitespace or empty string')
@@ -3103,6 +3142,15 @@
     if categories is not None:
         _details['categories'] = cli_util.parse_json_parameter("categories", 
categories)
 
+    if is_position_aware is not None:
+        _details['isPositionAware'] = is_position_aware
+
+    if dependent_sources is not None:
+        _details['dependentSources'] = 
cli_util.parse_json_parameter("dependent_sources", dependent_sources)
+
+    if dependent_parsers is not None:
+        _details['dependentParsers'] = 
cli_util.parse_json_parameter("dependent_parsers", dependent_parsers)
+
     client = cli_util.build_client('log_analytics', 'log_analytics', ctx)
     result = client.extract_structured_log_header_paths(
         namespace_name=namespace_name,
@@ -5164,20 +5212,21 @@
 @cli_util.option('--hostname', help=u"""A filter to return only log analytics 
entities whose hostname matches the entire hostname given.""")
 @cli_util.option('--hostname-contains', help=u"""A filter to return only log 
analytics entities whose hostname contains the substring given. The match is 
case-insensitive.""")
 @cli_util.option('--source-id', help=u"""A filter to return only log analytics 
entities whose sourceId matches the sourceId given.""")
-@cli_util.option('--creation-source-type', 
type=custom_types.CliCaseInsensitiveChoice(["EM_BRIDGE", "BULK_DISCOVERY", 
"SERVICE_CONNECTOR_HUB", "DISCOVERY", "NONE"]), multiple=True, help=u"""A 
filter to return only those log analytics entities with the specified 
auto-creation source.""")
+@cli_util.option('--creation-source-type', 
type=custom_types.CliCaseInsensitiveChoice(["EM_BRIDGE", "BULK_DISCOVERY", 
"SERVICE_CONNECTOR_HUB", "DISCOVERY", "LOGGING_ANALYTICS", "NONE"]), 
multiple=True, help=u"""A filter to return only those log analytics entities 
with the specified auto-creation source.""")
 @cli_util.option('--creation-source-details', help=u"""A filter to return only 
log analytics entities whose auto-creation source details contains the 
specified string.""")
 @cli_util.option('--limit', type=click.INT, help=u"""The maximum number of 
items to return.""")
 @cli_util.option('--page', help=u"""The page token representing the page at 
which to start retrieving results. This is usually retrieved from a previous 
list call.""")
 @cli_util.option('--sort-order', 
type=custom_types.CliCaseInsensitiveChoice(["ASC", "DESC"]), help=u"""The sort 
order to use, either ascending (`ASC`) or descending (`DESC`).""")
 @cli_util.option('--sort-by', 
type=custom_types.CliCaseInsensitiveChoice(["timeCreated", "timeUpdated", 
"name"]), help=u"""The field to sort entities by. Only one sort order may be 
provided. Default order for timeCreated and timeUpdated is descending. Default 
order for entity name is ascending. If no value is specified timeCreated is 
default.""")
+@cli_util.option('--metadata-equals', multiple=True, help=u"""A filter to 
return only log analytics entities whose metadata name, value and type matches 
the specified string. Each item in the array has the format 
\"{name}:{value}:{type}\".  All inputs are case-insensitive.""")
 @cli_util.option('--all', 'all_pages', is_flag=True, help="""Fetches all pages 
of results. If you provide this option, then you cannot provide the --limit 
option.""")
 @cli_util.option('--page-size', type=click.INT, help="""When fetching results, 
the number of results to fetch per call. Only valid when used with --all or 
--limit, and ignored otherwise.""")
-@json_skeleton_utils.get_cli_json_input_option({'entity-type-name': {'module': 
'log_analytics', 'class': 'list[string]'}})
+@json_skeleton_utils.get_cli_json_input_option({'entity-type-name': {'module': 
'log_analytics', 'class': 'list[string]'}, 'metadata-equals': {'module': 
'log_analytics', 'class': 'list[string]'}})
 @cli_util.help_option
 @click.pass_context
-@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'entity-type-name':
 {'module': 'log_analytics', 'class': 'list[string]'}}, output_type={'module': 
'log_analytics', 'class': 'LogAnalyticsEntityCollection'})
+@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'entity-type-name':
 {'module': 'log_analytics', 'class': 'list[string]'}, 'metadata-equals': 
{'module': 'log_analytics', 'class': 'list[string]'}}, output_type={'module': 
'log_analytics', 'class': 'LogAnalyticsEntityCollection'})
 @cli_util.wrap_exceptions
-def list_log_analytics_entities(ctx, from_json, all_pages, page_size, 
namespace_name, compartment_id, name, name_contains, entity_type_name, 
cloud_resource_id, lifecycle_state, lifecycle_details_contains, 
is_management_agent_id_null, hostname, hostname_contains, source_id, 
creation_source_type, creation_source_details, limit, page, sort_order, 
sort_by):
+def list_log_analytics_entities(ctx, from_json, all_pages, page_size, 
namespace_name, compartment_id, name, name_contains, entity_type_name, 
cloud_resource_id, lifecycle_state, lifecycle_details_contains, 
is_management_agent_id_null, hostname, hostname_contains, source_id, 
creation_source_type, creation_source_details, limit, page, sort_order, 
sort_by, metadata_equals):
 
     if all_pages and limit:
         raise click.UsageError('If you provide the --all option you cannot 
provide the --limit option')
@@ -5218,6 +5267,8 @@
         kwargs['sort_order'] = sort_order
     if sort_by is not None:
         kwargs['sort_by'] = sort_by
+    if metadata_equals is not None and len(metadata_equals) > 0:
+        kwargs['metadata_equals'] = metadata_equals
     kwargs['opc_request_id'] = 
cli_util.use_or_generate_request_id(ctx.obj['request_id'])
     client = cli_util.build_client('log_analytics', 'log_analytics', ctx)
     if all_pages:
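
Because --metadata-equals is declared with multiple=True, Click delivers it as a tuple of "{name}:{value}:{type}" strings, and the hunk above forwards it only when at least one value was given. A standalone sketch (not part of the diff) with hypothetical filter values:

    # Hypothetical repeated option values as collected by Click.
    metadata_equals = ("env:prod:string", "team:search:string")

    kwargs = {}
    if metadata_equals is not None and len(metadata_equals) > 0:
        kwargs['metadata_equals'] = metadata_equals

    print(kwargs)
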
@@ -5256,14 +5307,15 @@
 @cli_util.option('--page', help=u"""The page token representing the page at 
which to start retrieving results. This is usually retrieved from a previous 
list call.""")
 @cli_util.option('--sort-order', 
type=custom_types.CliCaseInsensitiveChoice(["ASC", "DESC"]), help=u"""The sort 
order to use, either ascending (`ASC`) or descending (`DESC`).""")
 @cli_util.option('--sort-by', 
type=custom_types.CliCaseInsensitiveChoice(["timeCreated", "timeUpdated", 
"name"]), help=u"""The field to sort entities by. Only one sort order may be 
provided. Default order for timeCreated and timeUpdated is descending. Default 
order for entity name is ascending. If no value is specified timeCreated is 
default.""")
+@cli_util.option('--metadata-equals', multiple=True, help=u"""A filter to 
return only log analytics entities whose metadata name, value and type matches 
the specified string. Each item in the array has the format 
\"{name}:{value}:{type}\".  All inputs are case-insensitive.""")
 @cli_util.option('--all', 'all_pages', is_flag=True, help="""Fetches all pages 
of results. If you provide this option, then you cannot provide the --limit 
option.""")
 @cli_util.option('--page-size', type=click.INT, help="""When fetching results, 
the number of results to fetch per call. Only valid when used with --all or 
--limit, and ignored otherwise.""")
-@json_skeleton_utils.get_cli_json_input_option({})
+@json_skeleton_utils.get_cli_json_input_option({'metadata-equals': {'module': 
'log_analytics', 'class': 'list[string]'}})
 @cli_util.help_option
 @click.pass_context
-@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={},
 output_type={'module': 'log_analytics', 'class': 
'LogAnalyticsEntityTopologyCollection'})
+@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'metadata-equals':
 {'module': 'log_analytics', 'class': 'list[string]'}}, output_type={'module': 
'log_analytics', 'class': 'LogAnalyticsEntityTopologyCollection'})
 @cli_util.wrap_exceptions
-def list_log_analytics_entity_topology(ctx, from_json, all_pages, page_size, 
namespace_name, log_analytics_entity_id, lifecycle_state, limit, page, 
sort_order, sort_by):
+def list_log_analytics_entity_topology(ctx, from_json, all_pages, page_size, 
namespace_name, log_analytics_entity_id, lifecycle_state, limit, page, 
sort_order, sort_by, metadata_equals):
 
     if all_pages and limit:
         raise click.UsageError('If you provide the --all option you cannot 
provide the --limit option')
@@ -5285,6 +5337,8 @@
         kwargs['sort_order'] = sort_order
     if sort_by is not None:
         kwargs['sort_by'] = sort_by
+    if metadata_equals is not None and len(metadata_equals) > 0:
+        kwargs['metadata_equals'] = metadata_equals
     kwargs['opc_request_id'] = 
cli_util.use_or_generate_request_id(ctx.obj['request_id'])
     client = cli_util.build_client('log_analytics', 'log_analytics', ctx)
     if all_pages:
@@ -5574,7 +5628,7 @@
 
 
@log_analytics_lookup_group.command(name=cli_util.override('log_analytics.list_lookups.command_name',
 'list-lookups'), help=u"""Returns a list of lookups, containing detailed 
information about them. You may limit the number of results, provide sorting 
order, and filter by information such as lookup name, description and type. 
\n[Command Reference](listLookups)""")
 @cli_util.option('--namespace-name', required=True, help=u"""The Logging 
Analytics namespace used for the request.""")
-@cli_util.option('--type', required=True, 
type=custom_types.CliCaseInsensitiveChoice(["Lookup", "Dictionary"]), 
help=u"""The lookup type.  Valid values are Lookup or Dictionary.""")
+@cli_util.option('--type', required=True, 
type=custom_types.CliCaseInsensitiveChoice(["Lookup", "Dictionary", "Module"]), 
help=u"""The lookup type.  Valid values are Lookup, Dictionary or Module.""")
 @cli_util.option('--lookup-display-text', help=u"""The lookup text used for 
filtering.  Only lookups with the specified name or description will be 
returned.""")
 @cli_util.option('--is-system', 
type=custom_types.CliCaseInsensitiveChoice(["ALL", "CUSTOM", "BUILT_IN"]), 
help=u"""The system value used for filtering.  Only items with the specified 
system value will be returned.  Valid values are built in, custom (for user 
defined items), or all (for all items, regardless of system value).""")
 @cli_util.option('--sort-by', 
type=custom_types.CliCaseInsensitiveChoice(["displayName", "status", "type", 
"updatedTime", "creationType"]), help=u"""sort by field""")
@@ -6250,6 +6304,7 @@
 @cli_util.option('--compartment-id', required=True, help=u"""The ID of the 
compartment in which to list resources.""")
 @cli_util.option('--display-name', help=u"""A filter to return rules whose 
displayName matches in whole or in part the specified value. The match is 
case-insensitive.""")
 @cli_util.option('--kind', 
type=custom_types.CliCaseInsensitiveChoice(["INGEST_TIME", "SAVED_SEARCH", 
"ALL"]), help=u"""The rule kind used for filtering. Only rules of the specified 
kind will be returned.""")
+@cli_util.option('--target-service', help=u"""The target service to use for 
filtering.""")
 @cli_util.option('--lifecycle-state', 
type=custom_types.CliCaseInsensitiveChoice(["ACTIVE", "DELETED"]), help=u"""The 
rule lifecycle state used for filtering. Currently supported values are ACTIVE 
and DELETED.""")
 @cli_util.option('--limit', type=click.INT, help=u"""The maximum number of 
items to return.""")
 @cli_util.option('--page', help=u"""The page token representing the page at 
which to start retrieving results. This is usually retrieved from a previous 
list call.""")
@@ -6262,7 +6317,7 @@
 @click.pass_context
 
@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={},
 output_type={'module': 'log_analytics', 'class': 'RuleSummaryCollection'})
 @cli_util.wrap_exceptions
-def list_rules(ctx, from_json, all_pages, page_size, namespace_name, 
compartment_id, display_name, kind, lifecycle_state, limit, page, sort_order, 
sort_by):
+def list_rules(ctx, from_json, all_pages, page_size, namespace_name, 
compartment_id, display_name, kind, target_service, lifecycle_state, limit, 
page, sort_order, sort_by):
 
     if all_pages and limit:
         raise click.UsageError('If you provide the --all option you cannot 
provide the --limit option')
@@ -6275,6 +6330,8 @@
         kwargs['display_name'] = display_name
     if kind is not None:
         kwargs['kind'] = kind
+    if target_service is not None:
+        kwargs['target_service'] = target_service
     if lifecycle_state is not None:
         kwargs['lifecycle_state'] = lifecycle_state
     if limit is not None:
@@ -6326,6 +6383,7 @@
 @cli_util.option('--sort-by', 
type=custom_types.CliCaseInsensitiveChoice(["timeCreated", "timeUpdated", 
"displayName"]), help=u"""The field to sort by. Only one sort order may be 
provided. Default order for timeCreated is descending. Default order for 
displayName is ascending. If no value is specified timeCreated is default.""")
 @cli_util.option('--saved-search-id', help=u"""A filter to return only 
scheduled tasks whose stream action savedSearchId matches the given 
ManagementSavedSearch id [OCID] exactly.""")
 @cli_util.option('--display-name-contains', help=u"""A filter to return only 
resources whose display name contains the substring.""")
+@cli_util.option('--target-service', help=u"""The target service to use for 
filtering.""")
 @cli_util.option('--all', 'all_pages', is_flag=True, help="""Fetches all pages 
of results. If you provide this option, then you cannot provide the --limit 
option.""")
 @cli_util.option('--page-size', type=click.INT, help="""When fetching results, 
the number of results to fetch per call. Only valid when used with --all or 
--limit, and ignored otherwise.""")
 @json_skeleton_utils.get_cli_json_input_option({})
@@ -6333,7 +6391,7 @@
 @click.pass_context
 
@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={},
 output_type={'module': 'log_analytics', 'class': 'ScheduledTaskCollection'})
 @cli_util.wrap_exceptions
-def list_scheduled_tasks(ctx, from_json, all_pages, page_size, namespace_name, 
task_type, compartment_id, limit, page, display_name, sort_order, sort_by, 
saved_search_id, display_name_contains):
+def list_scheduled_tasks(ctx, from_json, all_pages, page_size, namespace_name, 
task_type, compartment_id, limit, page, display_name, sort_order, sort_by, 
saved_search_id, display_name_contains, target_service):
 
     if all_pages and limit:
         raise click.UsageError('If you provide the --all option you cannot 
provide the --limit option')
@@ -6356,6 +6414,8 @@
         kwargs['saved_search_id'] = saved_search_id
     if display_name_contains is not None:
         kwargs['display_name_contains'] = display_name_contains
+    if target_service is not None:
+        kwargs['target_service'] = target_service
     kwargs['opc_request_id'] = 
cli_util.use_or_generate_request_id(ctx.obj['request_id'])
     client = cli_util.build_client('log_analytics', 'log_analytics', ctx)
     if all_pages:
@@ -6793,6 +6853,7 @@
 @cli_util.option('--limit', type=click.INT, help=u"""The maximum number of 
items to return.""")
 @cli_util.option('--page', help=u"""The page token representing the page at 
which to start retrieving results. This is usually retrieved from a previous 
list call.""")
 @cli_util.option('--name', help=u"""A filter to return only log analytics 
entities whose name matches the entire name given. The match is 
case-insensitive.""")
+@cli_util.option('--source-type', help=u"""The source type.""")
 @cli_util.option('--categories', help=u"""A comma-separated list of categories 
used for filtering""")
 @cli_util.option('--is-simplified', type=click.BOOL, help=u"""A flag 
specifying whether or not to return all source information, or a subset of the 
information about each source.  A value of true will return only the source 
unique identifier and the source name.  A value of false will return all source 
information (such as author, updated date, system flag, etc.)""")
 @cli_util.option('--all', 'all_pages', is_flag=True, help="""Fetches all pages 
of results. If you provide this option, then you cannot provide the --limit 
option.""")
@@ -6802,7 +6863,7 @@
 @click.pass_context
 
@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={},
 output_type={'module': 'log_analytics', 'class': 
'LogAnalyticsSourceCollection'})
 @cli_util.wrap_exceptions
-def list_sources(ctx, from_json, all_pages, page_size, namespace_name, 
compartment_id, entity_type, source_display_text, is_system, 
is_auto_associated, sort_order, sort_by, limit, page, name, categories, 
is_simplified):
+def list_sources(ctx, from_json, all_pages, page_size, namespace_name, 
compartment_id, entity_type, source_display_text, is_system, 
is_auto_associated, sort_order, sort_by, limit, page, name, source_type, 
categories, is_simplified):
 
     if all_pages and limit:
         raise click.UsageError('If you provide the --all option you cannot 
provide the --limit option')
@@ -6829,6 +6890,8 @@
         kwargs['page'] = page
     if name is not None:
         kwargs['name'] = name
+    if source_type is not None:
+        kwargs['source_type'] = source_type
     if categories is not None:
         kwargs['categories'] = categories
     if is_simplified is not None:
@@ -7991,6 +8054,7 @@
 @cli_util.option('--query-parameterconflict', help=u"""This is the query that 
identifies the recalled data.""")
 @cli_util.option('--purpose', help=u"""This is the purpose of the recall""")
 @cli_util.option('--is-recall-new-data-only', type=click.BOOL, help=u"""This 
indicates if only new data has to be recalled in this recall request""")
+@cli_util.option('--is-use-recommended-data-set', type=click.BOOL, 
help=u"""This indicates if user checked system recommended time range""")
 @cli_util.option('--if-match', help=u"""For optimistic concurrency control. In 
the PUT or DELETE call for a resource, set the `if-match` parameter to the 
value of the etag from a previous GET or POST response for that resource. The 
resource will be updated or deleted only if the etag you provide matches the 
resource's current etag value.""")
 @cli_util.option('--wait-for-state', 
type=custom_types.CliCaseInsensitiveChoice(["ACCEPTED", "IN_PROGRESS", 
"FAILED", "SUCCEEDED", "CANCELING", "CANCELED"]), multiple=True, help="""This 
operation asynchronously creates, modifies or deletes a resource and uses a 
work request to track the progress of the operation. Specify this option to 
perform the action and then wait until the work request reaches a certain 
state. Multiple states can be specified, returning on the first state. For 
example, --wait-for-state SUCCEEDED --wait-for-state FAILED would return on 
whichever lifecycle state is reached first. If timeout is reached, a return 
code of 2 is returned. For any other error, a return code of 1 is returned.""")
 @cli_util.option('--max-wait-seconds', type=click.INT, help="""The maximum 
time to wait for the work request to reach the state defined by 
--wait-for-state. Defaults to 1200 seconds.""")
@@ -8000,7 +8064,7 @@
 @click.pass_context
 
@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={},
 output_type={'module': 'log_analytics', 'class': 'RecalledDataInfo'})
 @cli_util.wrap_exceptions
-def recall_archived_data(ctx, from_json, wait_for_state, max_wait_seconds, 
wait_interval_seconds, namespace_name, compartment_id, time_data_ended, 
time_data_started, data_type, log_sets, query_parameterconflict, purpose, 
is_recall_new_data_only, if_match):
+def recall_archived_data(ctx, from_json, wait_for_state, max_wait_seconds, 
wait_interval_seconds, namespace_name, compartment_id, time_data_ended, 
time_data_started, data_type, log_sets, query_parameterconflict, purpose, 
is_recall_new_data_only, is_use_recommended_data_set, if_match):
 
     if isinstance(namespace_name, six.string_types) and 
len(namespace_name.strip()) == 0:
         raise click.UsageError('Parameter --namespace-name cannot be 
whitespace or empty string')
@@ -8030,6 +8094,9 @@
     if is_recall_new_data_only is not None:
         _details['isRecallNewDataOnly'] = is_recall_new_data_only
 
+    if is_use_recommended_data_set is not None:
+        _details['isUseRecommendedDataSet'] = is_use_recommended_data_set
+
     client = cli_util.build_client('log_analytics', 'log_analytics', ctx)
     result = client.recall_archived_data(
         namespace_name=namespace_name,
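
The new --is-use-recommended-data-set option is a click.BOOL, so it stays None when omitted and the key is then left out of the recall payload entirely. A standalone sketch (not part of the diff); the helper name and values below are hypothetical:

    def build_recall_details(is_recall_new_data_only=None,
                             is_use_recommended_data_set=None):
        details = {}
        if is_recall_new_data_only is not None:
            details['isRecallNewDataOnly'] = is_recall_new_data_only
        if is_use_recommended_data_set is not None:
            details['isUseRecommendedDataSet'] = is_use_recommended_data_set
        return details

    print(build_recall_details())                                  # {}
    print(build_recall_details(is_use_recommended_data_set=True))  # key included
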
@@ -8068,7 +8135,7 @@
 
 
@log_analytics_lookup_group.command(name=cli_util.override('log_analytics.register_lookup.command_name',
 'register-lookup'), help=u"""Creates a lookup with the specified name, type 
and description. The csv file containing the lookup content is passed in as 
binary data in the request. \n[Command Reference](registerLookup)""")
 @cli_util.option('--namespace-name', required=True, help=u"""The Logging 
Analytics namespace used for the request.""")
-@cli_util.option('--type', required=True, 
type=custom_types.CliCaseInsensitiveChoice(["Lookup", "Dictionary"]), 
help=u"""The lookup type.  Valid values are Lookup or Dictionary.""")
+@cli_util.option('--type', required=True, 
type=custom_types.CliCaseInsensitiveChoice(["Lookup", "Dictionary", "Module"]), 
help=u"""The lookup type.  Valid values are Lookup, Dictionary or Module.""")
 @cli_util.option('--register-lookup-content-file-body', required=True, 
help=u"""file containing data for lookup creation""")
 @cli_util.option('--name', help=u"""A filter to return only log analytics 
entities whose name matches the entire name given. The match is 
case-insensitive.""")
 @cli_util.option('--description', help=u"""The description for a created 
lookup.""")
@@ -8117,6 +8184,7 @@
 @cli_util.option('--time-data-ended', required=True, 
type=custom_types.CLI_DATETIME, help=u"""This is the end of the time 
interval""" + custom_types.CLI_DATETIME.VALID_DATETIME_CLI_HELP_MESSAGE)
 @cli_util.option('--time-data-started', required=True, 
type=custom_types.CLI_DATETIME, help=u"""This is the start of the time 
interval""" + custom_types.CLI_DATETIME.VALID_DATETIME_CLI_HELP_MESSAGE)
 @cli_util.option('--data-type', 
type=custom_types.CliCaseInsensitiveChoice(["LOG", "LOOKUP"]), help=u"""This is 
the type of the recalled data to be released""")
+@cli_util.option('--collection-id', type=click.INT, help=u"""This is the id 
for the recalled data collection to be released. If specified, only this 
collection will be released""")
 @cli_util.option('--if-match', help=u"""For optimistic concurrency control. In 
the PUT or DELETE call for a resource, set the `if-match` parameter to the 
value of the etag from a previous GET or POST response for that resource. The 
resource will be updated or deleted only if the etag you provide matches the 
resource's current etag value.""")
 @cli_util.option('--wait-for-state', 
type=custom_types.CliCaseInsensitiveChoice(["ACCEPTED", "IN_PROGRESS", 
"FAILED", "SUCCEEDED", "CANCELING", "CANCELED"]), multiple=True, help="""This 
operation asynchronously creates, modifies or deletes a resource and uses a 
work request to track the progress of the operation. Specify this option to 
perform the action and then wait until the work request reaches a certain 
state. Multiple states can be specified, returning on the first state. For 
example, --wait-for-state SUCCEEDED --wait-for-state FAILED would return on 
whichever lifecycle state is reached first. If timeout is reached, a return 
code of 2 is returned. For any other error, a return code of 1 is returned.""")
 @cli_util.option('--max-wait-seconds', type=click.INT, help="""The maximum 
time to wait for the work request to reach the state defined by 
--wait-for-state. Defaults to 1200 seconds.""")
@@ -8126,7 +8194,7 @@
 @click.pass_context
 
@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={})
 @cli_util.wrap_exceptions
-def release_recalled_data(ctx, from_json, wait_for_state, max_wait_seconds, 
wait_interval_seconds, namespace_name, compartment_id, time_data_ended, 
time_data_started, data_type, if_match):
+def release_recalled_data(ctx, from_json, wait_for_state, max_wait_seconds, 
wait_interval_seconds, namespace_name, compartment_id, time_data_ended, 
time_data_started, data_type, collection_id, if_match):
 
     if isinstance(namespace_name, six.string_types) and 
len(namespace_name.strip()) == 0:
         raise click.UsageError('Parameter --namespace-name cannot be 
whitespace or empty string')
@@ -8144,6 +8212,9 @@
     if data_type is not None:
         _details['dataType'] = data_type
 
+    if collection_id is not None:
+        _details['collectionId'] = collection_id
+
     client = cli_util.build_client('log_analytics', 'log_analytics', ctx)
     result = client.release_recalled_data(
         namespace_name=namespace_name,
@@ -8534,6 +8605,7 @@
 @cli_util.option('--field-qualifier', help=u"""The parser field qualifier.""")
 @cli_util.option('--type', type=custom_types.CliCaseInsensitiveChoice(["XML", 
"JSON", "REGEX", "ODL", "DELIMITED"]), help=u"""The parser type.  Default value 
is REGEX.""")
 @cli_util.option('--is-namespace-aware', type=click.BOOL, help=u"""A flag 
indicating whether the XML parser should consider the namespace(s) while 
processing the log data.""")
+@cli_util.option('--is-position-aware', type=click.BOOL, help=u"""A flag 
indicating whether the parser is positionally aware.""")
 @cli_util.option('--scope', 
type=custom_types.CliCaseInsensitiveChoice(["LOG_LINES", "LOG_ENTRIES", 
"LOG_LINES_LOG_ENTRIES"]), help=u"""The scope used when testing a parser.""")
 @cli_util.option('--req-origin-module', help=u"""The module to test.  A value 
of 'ParserFunctionTest' will result in testing of the parser functions.""")
 @json_skeleton_utils.get_cli_json_input_option({'field-maps': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsParserField]'}, 'metadata': 
{'module': 'log_analytics', 'class': 'UiParserTestMetadata'}, 
'parser-functions': {'module': 'log_analytics', 'class': 
'list[LogAnalyticsParserFunction]'}})
@@ -8541,7 +8613,7 @@
 @click.pass_context
 
@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'field-maps':
 {'module': 'log_analytics', 'class': 'list[LogAnalyticsParserField]'}, 
'metadata': {'module': 'log_analytics', 'class': 'UiParserTestMetadata'}, 
'parser-functions': {'module': 'log_analytics', 'class': 
'list[LogAnalyticsParserFunction]'}}, output_type={'module': 'log_analytics', 
'class': 'ParserTestResult'})
 @cli_util.wrap_exceptions
-def test_parser(ctx, from_json, namespace_name, content, description, 
display_name, encoding, example_content, field_maps, footer_content, 
header_content, name, is_default, is_single_line_content, is_system, language, 
time_updated, log_type_test_request_version, metadata, 
parser_ignoreline_characters, is_hidden, parser_sequence, parser_timezone, 
is_parser_written_once, parser_functions, should_tokenize_original_text, 
field_delimiter, field_qualifier, type, is_namespace_aware, scope, 
req_origin_module):
+def test_parser(ctx, from_json, namespace_name, content, description, 
display_name, encoding, example_content, field_maps, footer_content, 
header_content, name, is_default, is_single_line_content, is_system, language, 
time_updated, log_type_test_request_version, metadata, 
parser_ignoreline_characters, is_hidden, parser_sequence, parser_timezone, 
is_parser_written_once, parser_functions, should_tokenize_original_text, 
field_delimiter, field_qualifier, type, is_namespace_aware, is_position_aware, 
scope, req_origin_module):
 
     if isinstance(namespace_name, six.string_types) and 
len(namespace_name.strip()) == 0:
         raise click.UsageError('Parameter --namespace-name cannot be 
whitespace or empty string')
@@ -8636,6 +8708,9 @@
     if is_namespace_aware is not None:
         _details['isNamespaceAware'] = is_namespace_aware
 
+    if is_position_aware is not None:
+        _details['isPositionAware'] = is_position_aware
+
     client = cli_util.build_client('log_analytics', 'log_analytics', ctx)
     result = client.test_parser(
         namespace_name=namespace_name,
@@ -9001,17 +9076,19 @@
 @cli_util.option('--properties', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""The name/value pairs for parameter values to be used in file patterns 
specified in log sources.""" + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
 @cli_util.option('--freeform-tags', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""Simple key-value pair that is applied without any predefined name, 
type or scope. Exists for cross-compatibility only. Example: `{\"bar-key\": 
\"value\"}`""" + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
 @cli_util.option('--defined-tags', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""Defined tags for this resource. Each key is predefined and scoped to a 
namespace. Example: `{\"foo-namespace\": {\"bar-key\": \"value\"}}`""" + 
custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
+@cli_util.option('--time-last-discovered', type=custom_types.CLI_DATETIME, 
help=u"""The date and time the resource was last discovered, in the format 
defined by RFC3339.""" + 
custom_types.CLI_DATETIME.VALID_DATETIME_CLI_HELP_MESSAGE)
+@cli_util.option('--metadata', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""""" + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
 @cli_util.option('--if-match', help=u"""For optimistic concurrency control. In 
the PUT or DELETE call for a resource, set the `if-match` parameter to the 
value of the etag from a previous GET or POST response for that resource. The 
resource will be updated or deleted only if the etag you provide matches the 
resource's current etag value.""")
 @cli_util.option('--force', help="""Perform update without prompting for 
confirmation.""", is_flag=True)
 @cli_util.option('--wait-for-state', 
type=custom_types.CliCaseInsensitiveChoice(["ACTIVE", "DELETED"]), 
multiple=True, help="""This operation creates, modifies or deletes a resource 
that has a defined lifecycle state. Specify this option to perform the action 
and then wait until the resource reaches a given lifecycle state. Multiple 
states can be specified, returning on the first state. For example, 
--wait-for-state SUCCEEDED --wait-for-state FAILED would return on whichever 
lifecycle state is reached first. If timeout is reached, a return code of 2 is 
returned. For any other error, a return code of 1 is returned.""")
 @cli_util.option('--max-wait-seconds', type=click.INT, help="""The maximum 
time to wait for the resource to reach the lifecycle state defined by 
--wait-for-state. Defaults to 1200 seconds.""")
 @cli_util.option('--wait-interval-seconds', type=click.INT, help="""Check 
every --wait-interval-seconds to see whether the resource has reached the 
lifecycle state defined by --wait-for-state. Defaults to 30 seconds.""")
-@json_skeleton_utils.get_cli_json_input_option({'properties': {'module': 
'log_analytics', 'class': 'dict(str, string)'}, 'freeform-tags': {'module': 
'log_analytics', 'class': 'dict(str, string)'}, 'defined-tags': {'module': 
'log_analytics', 'class': 'dict(str, dict(str, object))'}})
+@json_skeleton_utils.get_cli_json_input_option({'properties': {'module': 
'log_analytics', 'class': 'dict(str, string)'}, 'freeform-tags': {'module': 
'log_analytics', 'class': 'dict(str, string)'}, 'defined-tags': {'module': 
'log_analytics', 'class': 'dict(str, dict(str, object))'}, 'metadata': 
{'module': 'log_analytics', 'class': 'LogAnalyticsMetadataDetails'}})
 @cli_util.help_option
 @click.pass_context
-@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'properties':
 {'module': 'log_analytics', 'class': 'dict(str, string)'}, 'freeform-tags': 
{'module': 'log_analytics', 'class': 'dict(str, string)'}, 'defined-tags': 
{'module': 'log_analytics', 'class': 'dict(str, dict(str, object))'}}, 
output_type={'module': 'log_analytics', 'class': 'LogAnalyticsEntity'})
+@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'properties':
 {'module': 'log_analytics', 'class': 'dict(str, string)'}, 'freeform-tags': 
{'module': 'log_analytics', 'class': 'dict(str, string)'}, 'defined-tags': 
{'module': 'log_analytics', 'class': 'dict(str, dict(str, object))'}, 
'metadata': {'module': 'log_analytics', 'class': 
'LogAnalyticsMetadataDetails'}}, output_type={'module': 'log_analytics', 
'class': 'LogAnalyticsEntity'})
 @cli_util.wrap_exceptions
-def update_log_analytics_entity(ctx, from_json, force, wait_for_state, 
max_wait_seconds, wait_interval_seconds, namespace_name, 
log_analytics_entity_id, name, management_agent_id, timezone_region, hostname, 
properties, freeform_tags, defined_tags, if_match):
+def update_log_analytics_entity(ctx, from_json, force, wait_for_state, 
max_wait_seconds, wait_interval_seconds, namespace_name, 
log_analytics_entity_id, name, management_agent_id, timezone_region, hostname, 
properties, freeform_tags, defined_tags, time_last_discovered, metadata, 
if_match):
 
     if isinstance(namespace_name, six.string_types) and 
len(namespace_name.strip()) == 0:
         raise click.UsageError('Parameter --namespace-name cannot be 
whitespace or empty string')
@@ -9019,8 +9096,8 @@
     if isinstance(log_analytics_entity_id, six.string_types) and 
len(log_analytics_entity_id.strip()) == 0:
         raise click.UsageError('Parameter --log-analytics-entity-id cannot be 
whitespace or empty string')
     if not force:
-        if properties or freeform_tags or defined_tags:
-            if not click.confirm("WARNING: Updates to properties and 
freeform-tags and defined-tags will replace any existing values. Are you sure 
you want to continue?"):
+        if properties or freeform_tags or defined_tags or metadata:
+            if not click.confirm("WARNING: Updates to properties and 
freeform-tags and defined-tags and metadata will replace any existing values. 
Are you sure you want to continue?"):
                 ctx.abort()
 
     kwargs = {}
@@ -9051,6 +9128,12 @@
     if defined_tags is not None:
         _details['definedTags'] = 
cli_util.parse_json_parameter("defined_tags", defined_tags)
 
+    if time_last_discovered is not None:
+        _details['timeLastDiscovered'] = time_last_discovered
+
+    if metadata is not None:
+        _details['metadata'] = cli_util.parse_json_parameter("metadata", 
metadata)
+
     client = cli_util.build_client('log_analytics', 'log_analytics', ctx)
     result = client.update_log_analytics_entity(
         namespace_name=namespace_name,
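
The update-entity payload now optionally carries the last-discovery timestamp and a metadata object, and the confirmation prompt covers metadata as well. A standalone sketch (not part of the diff); the metadata shape shown is a hypothetical example, not the exact LogAnalyticsMetadataDetails schema:

    import datetime
    import json

    time_last_discovered = datetime.datetime(2024, 2, 13, 11, 20,
                                              tzinfo=datetime.timezone.utc)
    metadata = '{"items": [{"name": "owner", "value": "search-team"}]}'

    _details = {}
    if time_last_discovered is not None:
        _details['timeLastDiscovered'] = time_last_discovered.isoformat()
    if metadata is not None:
        _details['metadata'] = json.loads(metadata)

    print(_details)
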
@@ -9726,6 +9809,7 @@
 @cli_util.option('--upload-discovery-data-details', required=True, 
help=u"""Discovery data""")
 @cli_util.option('--opc-meta-properties', help=u"""Metadata key and value 
pairs separated by a semicolon. Example k1:v1;k2:v2;k3:v3""")
 @cli_util.option('--discovery-data-type', 
type=custom_types.CliCaseInsensitiveChoice(["ENTITY", "K8S_OBJECTS"]), 
help=u"""Discovery data type""")
+@cli_util.option('--log-group-id', help=u"""The log group OCID that gets 
mapped to the logs in the discovery data.""")
 @cli_util.option('--payload-type', 
type=custom_types.CliCaseInsensitiveChoice(["JSON", "GZIP", "ZIP"]), 
help=u"""Identifies the type of request payload.""")
 @cli_util.option('--content-type', help=u"""The content type of the log 
data.""")
 @cli_util.option('--expect', help=u"""A value of `100-continue` requests 
preliminary verification of the request method, path, and headers before the 
request body is sent. If no error results from such verification, the server 
will send a 100 (Continue) interim response to indicate readiness for the 
request body. The only allowed value for this parameter is \"100-Continue\" 
(case-insensitive).""")
@@ -9734,7 +9818,7 @@
 @click.pass_context
 
@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={})
 @cli_util.wrap_exceptions
-def upload_discovery_data(ctx, from_json, namespace_name, 
upload_discovery_data_details, opc_meta_properties, discovery_data_type, 
payload_type, content_type, expect):
+def upload_discovery_data(ctx, from_json, namespace_name, 
upload_discovery_data_details, opc_meta_properties, discovery_data_type, 
log_group_id, payload_type, content_type, expect):
 
     if isinstance(namespace_name, six.string_types) and 
len(namespace_name.strip()) == 0:
         raise click.UsageError('Parameter --namespace-name cannot be 
whitespace or empty string')
@@ -9744,6 +9828,8 @@
         kwargs['opc_meta_properties'] = opc_meta_properties
     if discovery_data_type is not None:
         kwargs['discovery_data_type'] = discovery_data_type
+    if log_group_id is not None:
+        kwargs['log_group_id'] = log_group_id
     if payload_type is not None:
         kwargs['payload_type'] = payload_type
     if content_type is not None:
@@ -9771,13 +9857,14 @@
 @cli_util.option('--log-set', help=u"""The log set that gets associated with 
the uploaded logs.""")
 @cli_util.option('--payload-type', 
type=custom_types.CliCaseInsensitiveChoice(["JSON", "GZIP", "ZIP"]), 
help=u"""Identifies the type of request payload.""")
 @cli_util.option('--content-type', help=u"""The content type of the log 
data.""")
+@cli_util.option('--opc-meta-properties', help=u"""Metadata key and value 
pairs separated by a semicolon. Example k1:v1;k2:v2;k3:v3""")
 @cli_util.option('--expect', help=u"""A value of `100-continue` requests 
preliminary verification of the request method, path, and headers before the 
request body is sent. If no error results from such verification, the server 
will send a 100 (Continue) interim response to indicate readiness for the 
request body. The only allowed value for this parameter is \"100-Continue\" 
(case-insensitive).""")
 @json_skeleton_utils.get_cli_json_input_option({})
 @cli_util.help_option
 @click.pass_context
 
@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={})
 @cli_util.wrap_exceptions
-def upload_log_events_file(ctx, from_json, namespace_name, log_group_id, 
upload_log_events_file_details, log_set, payload_type, content_type, expect):
+def upload_log_events_file(ctx, from_json, namespace_name, log_group_id, 
upload_log_events_file_details, log_set, payload_type, content_type, 
opc_meta_properties, expect):
 
     if isinstance(namespace_name, six.string_types) and 
len(namespace_name.strip()) == 0:
         raise click.UsageError('Parameter --namespace-name cannot be 
whitespace or empty string')
@@ -9789,6 +9876,8 @@
         kwargs['payload_type'] = payload_type
     if content_type is not None:
         kwargs['content_type'] = content_type
+    if opc_meta_properties is not None:
+        kwargs['opc_meta_properties'] = opc_meta_properties
     if expect is not None:
         kwargs['expect'] = expect
     kwargs['opc_request_id'] = 
cli_util.use_or_generate_request_id(ctx.obj['request_id'])
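
The new --opc-meta-properties value is forwarded verbatim to the upload call; the split below only illustrates the documented "k1:v1;k2:v2" format and is not something the command itself does. A standalone sketch (not part of the diff) with a hypothetical value:

    opc_meta_properties = "env:prod;team:search"

    kwargs = {}
    if opc_meta_properties is not None:
        kwargs['opc_meta_properties'] = opc_meta_properties

    # Illustrating the documented key:value;key:value format.
    pairs = dict(item.split(":", 1) for item in opc_meta_properties.split(";"))
    print(kwargs, pairs)
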
@@ -10101,13 +10190,14 @@
 @cli_util.option('--categories', type=custom_types.CLI_COMPLEX_TYPE, 
help=u"""An array of categories to assign to the parser. Specifying the name 
attribute for each category would suffice. Oracle-defined category assignments 
cannot be removed.
 
 This option is a JSON list with items of type LogAnalyticsCategory.  For 
documentation on LogAnalyticsCategory please see our API reference: 
https://docs.cloud.oracle.com/api/#/en/loganalytics/20200601/datatypes/LogAnalyticsCategory."""
 + custom_types.cli_complex_type.COMPLEX_TYPE_HELP)
+@cli_util.option('--is-position-aware', type=click.BOOL, help=u"""A flag 
indicating whether the parser is positionally aware.""")
 @cli_util.option('--if-match', help=u"""For optimistic concurrency control. In 
the PUT or DELETE call for a resource, set the `if-match` parameter to the 
value of the etag from a previous GET or POST response for that resource. The 
resource will be updated or deleted only if the etag you provide matches the 
resource's current etag value.""")
 @json_skeleton_utils.get_cli_json_input_option({'field-maps': {'module': 
'log_analytics', 'class': 'list[LogAnalyticsParserField]'}, 'parser-functions': 
{'module': 'log_analytics', 'class': 'list[LogAnalyticsParserFunction]'}, 
'categories': {'module': 'log_analytics', 'class': 
'list[LogAnalyticsCategory]'}})
 @cli_util.help_option
 @click.pass_context
 
@json_skeleton_utils.json_skeleton_generation_handler(input_params_to_complex_types={'field-maps':
 {'module': 'log_analytics', 'class': 'list[LogAnalyticsParserField]'}, 
'parser-functions': {'module': 'log_analytics', 'class': 
'list[LogAnalyticsParserFunction]'}, 'categories': {'module': 'log_analytics', 
'class': 'list[LogAnalyticsCategory]'}}, output_type={'module': 
'log_analytics', 'class': 'LogAnalyticsParser'})
 @cli_util.wrap_exceptions
-def upsert_parser(ctx, from_json, namespace_name, content, description, 
display_name, edit_version, encoding, example_content, field_maps, 
footer_content, header_content, name, is_default, is_single_line_content, 
is_system, language, log_type_test_request_version, 
parser_ignoreline_characters, parser_sequence, parser_timezone, 
is_parser_written_once, parser_functions, should_tokenize_original_text, 
field_delimiter, field_qualifier, type, is_namespace_aware, categories, 
if_match):
+def upsert_parser(ctx, from_json, namespace_name, content, description, 
display_name, edit_version, encoding, example_content, field_maps, 
footer_content, header_content, name, is_default, is_single_line_content, 
is_system, language, log_type_test_request_version, 
parser_ignoreline_characters, parser_sequence, parser_timezone, 
is_parser_written_once, parser_functions, should_tokenize_original_text, 
field_delimiter, field_qualifier, type, is_namespace_aware, categories, 
is_position_aware, if_match):
 
     if isinstance(namespace_name, six.string_types) and 
len(namespace_name.strip()) == 0:
         raise click.UsageError('Parameter --namespace-name cannot be 
whitespace or empty string')
@@ -10197,6 +10287,9 @@
     if categories is not None:
         _details['categories'] = cli_util.parse_json_parameter("categories", 
categories)
 
+    if is_position_aware is not None:
+        _details['isPositionAware'] = is_position_aware
+
     client = cli_util.build_client('log_analytics', 'log_analytics', ctx)
     result = client.upsert_parser(
         namespace_name=namespace_name,
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/oci-cli-3.37.8/services/management_agent/src/oci_cli_management_agent/generated/managementagent_cli.py
 
new/oci-cli-3.37.9/services/management_agent/src/oci_cli_management_agent/generated/managementagent_cli.py
--- 
old/oci-cli-3.37.8/services/management_agent/src/oci_cli_management_agent/generated/managementagent_cli.py
  2024-02-06 07:35:21.000000000 +0100
+++ 
new/oci-cli-3.37.9/services/management_agent/src/oci_cli_management_agent/generated/managementagent_cli.py
  2024-02-13 11:20:04.000000000 +0100
@@ -118,6 +118,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
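
The repeated hunks in this file add a guard so that --wait-for-state no longer fails with a KeyError when the response carries no opc-work-request-id header; the last known resource state is rendered instead. A standalone sketch (not part of the diff) of that control flow, with a hypothetical helper name:

    def work_request_id_or_none(headers):
        if 'opc-work-request-id' not in headers:
            print('Encountered error while waiting for work request to enter '
                  'the specified state. Outputting last known resource state')
            return None
        # The real command hands this id to oci.wait_until.
        return headers['opc-work-request-id']

    print(work_request_id_or_none({}))
    print(work_request_id_or_none({'opc-work-request-id': 'example-id'}))
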
@@ -217,6 +221,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -337,6 +345,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -1431,6 +1443,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
@@ -1534,6 +1550,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(client, 
client.get_work_request(result.headers['opc-work-request-id']), 'status', 
wait_for_state, **wait_period_kwargs)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/oci-cli-3.37.8/services/vault/src/oci_cli_vaults/generated/vaults_cli.py 
new/oci-cli-3.37.9/services/vault/src/oci_cli_vaults/generated/vaults_cli.py
--- 
old/oci-cli-3.37.8/services/vault/src/oci_cli_vaults/generated/vaults_cli.py    
    2024-02-06 07:35:21.000000000 +0100
+++ 
new/oci-cli-3.37.9/services/vault/src/oci_cli_vaults/generated/vaults_cli.py    
    2024-02-13 11:20:04.000000000 +0100
@@ -544,6 +544,10 @@
                     wait_period_kwargs['max_wait_seconds'] = max_wait_seconds
                 if wait_interval_seconds is not None:
                     wait_period_kwargs['max_interval_seconds'] = 
wait_interval_seconds
+                if 'opc-work-request-id' not in result.headers:
+                    click.echo('Encountered error while waiting for work 
request to enter the specified state. Outputting last known resource state')
+                    cli_util.render_response(result, ctx)
+                    return
 
                 click.echo('Action completed. Waiting until the work request 
has entered state: {}'.format(wait_for_state), file=sys.stderr)
                 result = oci.wait_until(work_request_client, 
work_request_client.get_work_request(result.headers['opc-work-request-id']), 
'status', wait_for_state, **wait_period_kwargs)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/oci-cli-3.37.8/setup.py new/oci-cli-3.37.9/setup.py
--- old/oci-cli-3.37.8/setup.py 2024-02-06 07:35:21.000000000 +0100
+++ new/oci-cli-3.37.9/setup.py 2024-02-13 11:20:04.000000000 +0100
@@ -30,7 +30,7 @@
     readme = f.read()
 
 requires = [
-    'oci==2.121.0',
+    'oci==2.121.1',
     'arrow>=1.0.0',
     'certifi',
     'click==8.0.4',
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/oci-cli-3.37.8/src/oci_cli/version.py 
new/oci-cli-3.37.9/src/oci_cli/version.py
--- old/oci-cli-3.37.8/src/oci_cli/version.py   2024-02-06 07:35:21.000000000 
+0100
+++ new/oci-cli-3.37.9/src/oci_cli/version.py   2024-02-13 11:20:04.000000000 
+0100
@@ -2,4 +2,4 @@
 # Copyright (c) 2016, 2021, Oracle and/or its affiliates.  All rights reserved.
 # This software is dual-licensed to you under the Universal Permissive License 
(UPL) 1.0 as shown at https://oss.oracle.com/licenses/upl or Apache License 2.0 
as shown at http://www.apache.org/licenses/LICENSE-2.0. You may choose either 
license.
 
-__version__ = '3.37.8'
+__version__ = '3.37.9'
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' 
old/oci-cli-3.37.8/tests/resources/json_ignore_command_list.txt 
new/oci-cli-3.37.9/tests/resources/json_ignore_command_list.txt
--- old/oci-cli-3.37.8/tests/resources/json_ignore_command_list.txt     
2024-02-06 07:35:21.000000000 +0100
+++ new/oci-cli-3.37.9/tests/resources/json_ignore_command_list.txt     
2024-02-13 11:20:04.000000000 +0100
@@ -283,4 +283,12 @@
 apm-synthetics, monitor, update-dns-trace-monitor
 apm-synthetics, monitor, update-dns-server-monitor
 adm, remediation-run, list-application-dependency-recommendations
-management-agent, work-request, list
\ No newline at end of file
+management-agent, work-request, list
+log-analytics, entity, create
+log-analytics, entity, update
+log-analytics, parser, extract-structured-log-field-paths
+log-analytics, parser, extract-structured-log-header-paths
+log-analytics, entity, list
+log-analytics, entity-topology, list
+log-analytics, rule, list
+log-analytics, source, list-sources
\ No newline at end of file
