phanikumv commented on code in PR #906:
URL: https://github.com/apache/airflow-site/pull/906#discussion_r1427557860


##########
landing-pages/site/content/en/blog/airflow-2.8.0/index.md:
##########
@@ -0,0 +1,114 @@
+---
+title: "Apache Airflow 2.8.0 is here"
+linkTitle: "Apache Airflow 2.8.0 is here"
+author: "Ephraim Anierobi"
+github: "ephraimbuddy"
+linkedin: "ephraimanierobi"
+description: "Introducing Apache Airflow 2.8.0: Enhanced with New Features and Significant Improvements"
+tags: [Release]
+date: "2023-12-15"
+---
+
+I am thrilled to announce the release of Apache Airflow 2.8.0, featuring a host of significant enhancements and new features that will greatly benefit our community.
+
+**Details**:
+
+📦 PyPI: https://pypi.org/project/apache-airflow/2.8.0/ \
+📚 Docs: https://airflow.apache.org/docs/apache-airflow/2.8.0/ \
+🛠 Release Notes: https://airflow.apache.org/docs/apache-airflow/2.8.0/release_notes.html \
+🐳 Docker Image: `docker pull apache/airflow:2.8.0` \
+🚏 Constraints: https://github.com/apache/airflow/tree/constraints-2.8.0
+
+## Airflow Object Storage (AIP-58)
+
+*This feature is experimental and subject to change.*
+
+Airflow now offers a generic abstraction layer over various object stores like S3, GCS, and Azure Blob Storage, enabling the use of different storage systems in DAGs without code modification.
+
+In addition, it allows you to use most of the standard Python modules, such as `shutil`, that work with file-like objects.
+
+Here is an example of how to use the new feature to open a file:
+
+```python
+from airflow.decorators import task
+from airflow.io.path import ObjectStoragePath
+
+base = ObjectStoragePath("s3://my-bucket/", conn_id="aws_default")  # conn_id is optional
+
+@task
+def read_file(path: ObjectStoragePath) -> str:
+    with path.open() as f:
+        return f.read()
+```
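As a side note on the `shutil` point above: because these paths expose standard file-like objects via `.open()`, stdlib helpers work unchanged. Here is a local-filesystem illustration using `pathlib.Path` as a stand-in (a hedged sketch, not taken from the Airflow docs; the same pattern is expected to apply to `ObjectStoragePath`):

```python
import shutil
import tempfile
from pathlib import Path

# pathlib.Path stands in for ObjectStoragePath here: both expose .open(),
# which is all shutil.copyfileobj needs to stream data between locations.
with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "source.txt"
    dst = Path(tmp) / "copy.txt"
    src.write_text("hello from object storage")

    with src.open("rb") as fsrc, dst.open("wb") as fdst:
        shutil.copyfileobj(fsrc, fdst)  # streams in chunks, not all at once

    print(dst.read_text())  # -> hello from object storage
```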
+
+The above example is just the tip of the iceberg. The new feature also allows you to configure an alternative backend for a scheme or protocol.
+
+Here is an example of how to configure a custom backend for the `dbfs` scheme:
+
+```python
+from airflow.io.path import ObjectStoragePath
+from airflow.io.store import attach
+
+from fsspec.implementations.dbfs import DBFSFileSystem
+
+attach(protocol="dbfs", fs=DBFSFileSystem(instance="myinstance", token="mytoken"))
+base = ObjectStoragePath("dbfs://my-location/")
+```
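To make the registration mechanism above concrete, here is a minimal stdlib-only sketch of a protocol-to-filesystem registry. Everything in it (`register`, `resolve`, `MemoryFS`) is invented for illustration; it is not Airflow's actual `attach` implementation:

```python
from urllib.parse import urlsplit

_registry: dict[str, object] = {}


class MemoryFS:
    """Toy in-memory filesystem standing in for an fsspec implementation."""

    def __init__(self):
        self.files: dict[str, bytes] = {}

    def write(self, path: str, data: bytes) -> None:
        self.files[path] = data

    def read(self, path: str) -> bytes:
        return self.files[path]


def register(protocol: str, fs) -> None:
    """Like attach(): map a URL scheme to a filesystem instance."""
    _registry[protocol] = fs


def resolve(url: str):
    """Look up the filesystem for a URL's scheme, plus the path part."""
    parts = urlsplit(url)
    return _registry[parts.scheme], parts.netloc + parts.path


register("mem", MemoryFS())
fs, path = resolve("mem://bucket/file.txt")
fs.write(path, b"payload")
print(fs.read(path))  # -> b'payload'
```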
+
+For more information: [Airflow Object Storage](https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/objectstorage.html)
+
+Support for a specific object storage system depends on the installed providers, with out-of-the-box support for the `file` scheme.
+
+## Ship logs from other components to Task logs
+
+This feature integrates task-related messages from other Airflow components, including the scheduler and executors, into the task logs, allowing users to track error messages and other relevant information within a single log view.
+
+Currently, if a task is terminated by the scheduler before it starts, times out after prolonged queuing, or becomes a zombie, nothing is recorded in the task log. With this enhancement, an error message can be sent to the task log in such situations, making it easy to see on the UI.
+
+This feature can be toggled; for more information, look for `enable_task_context_logger` in the [logging configuration documentation](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#logging).
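For reference, toggling it in `airflow.cfg` would look like the sketch below (the option name comes from the linked configuration reference; confirm the exact name and default for your Airflow version):

```ini
[logging]
# When enabled, scheduler/executor messages about a task are forwarded
# into that task's own log.
enable_task_context_logger = True
```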
+
+## Listener hooks for Datasets

Review Comment:
   
   *Please note that listeners are still experimental and subject to change.*
   



##########
landing-pages/site/content/en/blog/airflow-2.8.0/index.md:
##########
+## Listener hooks for Datasets

Review Comment:
   Let's add this callout just below the header



##########
landing-pages/site/content/en/blog/airflow-2.8.0/index.md:
##########
+## Listener hooks for Datasets
+This feature enables users to subscribe to Dataset creation and update events using listener hooks. It's particularly useful for triggering external processes when a Dataset is created or updated.
+
+Please note that listeners are still experimental and subject to change.

Review Comment:
   ```suggestion
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
