This is an automated email from the ASF dual-hosted git repository.

skrawcz pushed a commit to branch docs/general_update
in repository https://gitbox.apache.org/repos/asf/burr.git


The following commit(s) were added to refs/heads/docs/general_update by this push:
     new 79a1866e Updates more references to Burr
79a1866e is described below

commit 79a1866e9891f1938df353f1e1f89bb85a8a4743
Author: Stefan Krawczyk <[email protected]>
AuthorDate: Sun Jul 6 09:51:47 2025 -0700

    Updates more references to Burr
---
 README.md                                          | 64 ++++++++++------------
 burr/core/application.py                           |  6 +-
 burr/core/state.py                                 |  2 +-
 burr/tracking/server/backend.py                    |  8 +--
 docs/concepts/hooks.rst                            |  2 +-
 docs/concepts/serde.rst                            |  2 +-
 docs/concepts/state-typing.rst                     |  2 +-
 docs/examples/agents/agent-patterns.md             |  8 +--
 docs/examples/agents/divide-and-conquer.md         |  4 +-
 docs/examples/chatbots/gpt-like-chatbot.ipynb      |  4 +-
 docs/examples/chatbots/rag-chatbot-hamilton.ipynb  |  2 +-
 docs/examples/data-science/ml_training.md          |  2 +-
 docs/examples/data-science/simulation.md           |  6 +-
 docs/examples/deployment/index.rst                 |  2 +-
 docs/examples/deployment/infrastructure.rst        |  8 +--
 docs/examples/deployment/monitoring.rst            | 10 ++--
 docs/examples/deployment/web-server.rst            | 14 ++---
 docs/examples/guardrails/creating_tests.rst        |  4 +-
 docs/getting_started/simple-example.rst            |  8 +--
 docs/main.rst                                      |  6 +-
 docs/reference/integrations/langchain.rst          |  8 +--
 .../conversational-rag/simple_example/README.md    |  2 +-
 examples/custom-serde/README.md                    |  2 +-
 examples/email-assistant/README.md                 |  2 +-
 examples/hamilton-integration/notebook.ipynb       |  2 +-
 examples/hello-world-counter/README.md             |  2 +-
 examples/image-telephone/README.md                 |  2 +-
 examples/llm-adventure-game/README.md              |  2 +-
 examples/ml-training/README.md                     |  2 +-
 examples/multi-agent-collaboration/README.md       |  4 +-
 .../multi-agent-collaboration/hamilton/README.md   |  4 +-
 examples/multi-agent-collaboration/lcel/README.md  |  2 +-
 examples/multi-modal-chatbot/README.md             |  2 +-
 examples/multi-modal-chatbot/burr_demo.ipynb       | 38 ++++++-------
 examples/other-examples/cowsay/README.md           |  2 +-
 examples/parallelism/README.md                     |  2 +-
 examples/parallelism/notebook.ipynb                |  4 +-
 examples/simple-chatbot-intro/README.md            |  2 +-
 examples/simulation/README.md                      |  4 +-
 examples/streaming-fastapi/notebook.ipynb          |  2 +-
 examples/streaming-overview/README.md              |  2 +-
 examples/test-case-creation/README.md              |  2 +-
 examples/tracing-and-spans/README.md               |  2 +-
 examples/tracing-and-spans/burr_otel_demo.ipynb    |  8 +--
 44 files changed, 130 insertions(+), 138 deletions(-)

diff --git a/README.md b/README.md
index 5f2bd117..ef8b685b 100644
--- a/README.md
+++ b/README.md
@@ -5,23 +5,17 @@
 
[![Discord](https://img.shields.io/badge/Join-Burr_Discord-7289DA?logo=discord)](https://discord.gg/6Zy2DwP4f3)
 
[![Downloads](https://static.pepy.tech/badge/burr/month)](https://pepy.tech/project/burr)
 ![PyPI Downloads](https://static.pepy.tech/badge/burr)
-[![GitHub Last Commit](https://img.shields.io/github/last-commit/dagworks-inc/burr)](https://github.com/dagworks-inc/burr/pulse)
-[![X](https://img.shields.io/badge/follow-%40burr_framework-1DA1F2?logo=x&style=social)](https://twitter.com/burr_framework)
-<a target="_blank" href="https://linkedin.com/showcase/dagworks-inc" style="background:none">
-  <img src="https://img.shields.io/badge/DAGWorks-Follow-purple.svg?logo=linkedin" />
-</a>
+[![GitHub Last Commit](https://img.shields.io/github/last-commit/apache/burr)](https://github.com/apache/burr/pulse)
+[![X](https://img.shields.io/badge/follow-%40burr_framework-1DA1F2?logo=x&style=social)](https://x.com/burr_framework)
 <a href="https://twitter.com/burr_framework" target="_blank">
   <img src="https://img.shields.io/badge/burr_framework-Follow-purple.svg?logo=X"/>
 </a>
-<a href="https://twitter.com/dagworks" target="_blank">
-  <img src="https://img.shields.io/badge/DAGWorks-Follow-purple.svg?logo=X"/>
-</a>
 
 </div>
 
-Burr makes it easy to develop applications that make decisions (chatbots, agents, simulations, etc...) from simple python building blocks.
+Apache Burr (incubating) makes it easy to develop applications that make decisions (chatbots, agents, simulations, etc...) from simple python building blocks.
 
-Burr works well for any application that uses LLMs, and can integrate with any of your favorite frameworks. Burr includes a UI that can track/monitor/trace your system in real time, along with
+Apache Burr works well for any application that uses LLMs, and can integrate with any of your favorite frameworks. Apache Burr includes a UI that can track/monitor/trace your system in real time, along with
 pluggable persisters (e.g. for memory) to save & load application state.
 
 Link to [documentation](https://burr.dagworks.io/). Quick (<3min) video intro [here](https://www.loom.com/share/a10f163428b942fea55db1a84b1140d8?sid=1512863b-f533-4a42-a2f3-95b13deb07c9).
@@ -43,7 +37,7 @@ Then run the UI server:
 burr
 ```
 
-This will open up Burr's telemetry UI. It comes loaded with some default data so you can click around.
+This will open up Apache Burr's telemetry UI. It comes loaded with some default data so you can click around.
 It also has a demo chat application to help demonstrate what the UI captures enabling you to see things changing in
 real-time. Hit the "Demos" side bar on the left and select `chatbot`. To chat it requires the `OPENAI_API_KEY`
 environment variable to be set, but you can still see how it works if you don't have an API key set.
@@ -51,7 +45,7 @@ environment variable to be set, but you can still see how it works if you don't
 Next, start coding / running examples:
 
 ```bash
-git clone https://github.com/dagworks-inc/burr && cd burr/examples/hello-world-counter
+git clone https://github.com/apache/burr && cd burr/examples/hello-world-counter
 python application.py
 ```
 
@@ -60,12 +54,12 @@ See if you can find it.
 
 For more details see the [getting started guide](https://burr.dagworks.io/getting_started/simple-example/).
 
-## πŸ”© How does Burr work?
+## πŸ”© How does Apache Burr work?
 
-With Burr you express your application as a state machine (i.e. a graph/flowchart).
+With Apache Burr you express your application as a state machine (i.e. a graph/flowchart).
 You can (and should!) use it for anything in which you have to manage state, track complex decisions, add human feedback, or dictate an idempotent, self-persisting workflow.
 
-The core API is simple -- the Burr hello-world looks like this (plug in your own LLM, or copy from [the docs](https://burr.dagworks.io/getting_started/simple-example/#build-a-simple-chatbot>) for _gpt-X_)
+The core API is simple -- the Apache Burr hello-world looks like this (plug in your own LLM, or copy from [the docs](https://burr.dagworks.io/getting_started/simple-example/#build-a-simple-chatbot>) for _gpt-X_)
 
 ```python
 from burr.core import action, State, ApplicationBuilder
@@ -79,7 +73,7 @@ def human_input(state: State, prompt: str) -> State:
 @action(reads=["chat_history"], writes=["response", "chat_history"])
 def ai_response(state: State) -> State:
     # query the LLM however you want (or don't use an LLM, up to you...)
-    response = _query_llm(state["chat_history"]) # Burr doesn't care how you use LLMs!
+    response = _query_llm(state["chat_history"]) # Apache Burr doesn't care how you use LLMs!
     chat_item = {"role" : "system", "content" : response}
     return state.update(response=response).append(chat_history=chat_item)
 
@@ -97,33 +91,33 @@ app = (
 print("answer:", app.state["response"])
 ```
 
-Burr includes:
+Apache Burr includes:
 
 1. A (dependency-free) low-abstraction python library that enables you to build and manage state machines with simple python functions
 2. A UI you can use to view execution telemetry for introspection and debugging
 3. A set of integrations to make it easier to persist state, connect to telemetry, and integrate with other systems
 
-![Burr at work](https://github.com/DAGWorks-Inc/burr/blob/main/chatbot.gif)
+![Apache Burr at work](https://github.com/apache/burr/blob/main/chatbot.gif)
 
-## πŸ’»οΈ What can you do with Burr?
+## πŸ’»οΈ What can you do with Apache Burr?
 
-Burr can be used to power a variety of applications, including:
+Apache Burr can be used to power a variety of applications, including:
 
-1. [A simple gpt-like chatbot](https://github.com/dagworks-inc/burr/tree/main/examples/multi-modal-chatbot)
-2. [A stateful RAG-based chatbot](https://github.com/dagworks-inc/burr/tree/main/examples/conversational-rag/simple_example)
-3. [An LLM-based adventure game](https://github.com/DAGWorks-Inc/burr/tree/main/examples/llm-adventure-game)
-4. [An interactive assistant for writing emails](https://github.com/DAGWorks-Inc/burr/tree/main/examples/email-assistant)
+1. [A simple gpt-like chatbot](https://github.com/apache/burr/tree/main/examples/multi-modal-chatbot)
+2. [A stateful RAG-based chatbot](https://github.com/apache/burr/tree/main/examples/conversational-rag/simple_example)
+3. [An LLM-based adventure game](https://github.com/apache/burr/tree/main/examples/llm-adventure-game)
+4. [An interactive assistant for writing emails](https://github.com/apache/burr/tree/main/examples/email-assistant)
 
-As well as a variety of (non-LLM) use-cases, including a time-series forecasting [simulation](https://github.com/DAGWorks-Inc/burr/tree/main/examples/simulation),
-and [hyperparameter tuning](https://github.com/DAGWorks-Inc/burr/tree/main/examples/ml-training).
+As well as a variety of (non-LLM) use-cases, including a time-series forecasting [simulation](https://github.com/apache/burr/tree/main/examples/simulation),
+and [hyperparameter tuning](https://github.com/apache/burr/tree/main/examples/ml-training).
 
 And a lot more!
 
 Using hooks and other integrations you can (a) integrate with any of your favorite vendors (LLM observability, storage, etc...), and
 (b) build custom actions that delegate to your favorite libraries (like [Hamilton](https://github.com/DAGWorks-Inc/hamilton)).
 
-Burr will _not_ tell you how to build your models, how to query APIs, or how to manage your data. It will help you tie all these together
-in a way that scales with your needs and makes following the logic of your system easy. Burr comes out of the box with a host of integrations
+Apache Burr will _not_ tell you how to build your models, how to query APIs, or how to manage your data. It will help you tie all these together
+in a way that scales with your needs and makes following the logic of your system easy. Apache Burr comes out of the box with a host of integrations
 including tooling to build a UI in streamlit and watch your state machine execute.
 
 ## πŸ— Start building
@@ -133,9 +127,9 @@ Then read through some of the concepts and write your own application!
 
 ## πŸ“ƒ Comparison against common frameworks
 
-While Burr is attempting something (somewhat) unique, there are a variety of tools that occupy similar spaces:
+While Apache Burr is attempting something (somewhat) unique, there are a variety of tools that occupy similar spaces:
 
-| Criteria                                          | Burr | Langgraph | temporal | Langchain | Superagent | Hamilton |
+| Criteria                                          | Apache Burr | Langgraph | temporal | Langchain | Superagent | Hamilton |
 | ------------------------------------------------- | :--: | :-------: | :------: | :-------: | :--------: | :------: |
 | Explicitly models a state machine                 |  βœ…  |    βœ…     |    ❌    |    ❌     |     ❌     |    ❌    |
 | Framework-agnostic                                |  βœ…  |    βœ…     |    βœ…    |    βœ…     |     ❌     |    βœ…    |
@@ -146,7 +140,7 @@ While Burr is attempting something (somewhat) unique, there are a variety of too
 
 ## 🌯 Why the name Burr?
 
-Burr is named after [Aaron Burr](https://en.wikipedia.org/wiki/Aaron_Burr), founding father, third VP of the United States, and murderer/arch-nemesis of [Alexander Hamilton](https://en.wikipedia.org/wiki/Alexander_Hamilton).
+Apache Burr is named after [Aaron Burr](https://en.wikipedia.org/wiki/Aaron_Burr), founding father, third VP of the United States, and murderer/arch-nemesis of [Alexander Hamilton](https://en.wikipedia.org/wiki/Alexander_Hamilton).
 What's the connection with Hamilton? This is [DAGWorks](www.dagworks.io)' second open-source library release after the [Hamilton library](https://github.com/dagworks-inc/hamilton)
 We imagine a world in which Burr and Hamilton lived in harmony and saw through their differences to better the union. We originally
 built Burr as a _harness_ to handle state between executions of Hamilton DAGs (because DAGs don't have cycles),
@@ -198,18 +192,16 @@ but realized that it has a wide array of applications and decided to release it
 
 ## πŸ›£ Roadmap
 
-While Burr is stable and well-tested, we have quite a few tools/features on our roadmap!
-1. FastAPI integration + hosted deployment -- make it really easy to get Burr in an app in production without thinking about REST APIs
+While Apache Burr is stable and well-tested, we have quite a few tools/features on our roadmap!
+1. FastAPI integration + hosted deployment -- make it really easy to get Apache Burr in an app in production without thinking about REST APIs
 2. Various efficiency/usability improvements for the core library (see [planned capabilities](https://burr.dagworks.io/concepts/planned-capabilities/) for more details). This includes:
    1. First-class support for retries + exception management
    2. More integration with popular frameworks (LCEL, LLamaIndex, Hamilton, etc...)
    3. Capturing & surfacing extra metadata, e.g. annotations for particular point in time, that you can then pull out for fine-tuning, etc.
    4. Improvements to the pydantic-based typing system
 3. Tooling for hosted execution of state machines, integrating with your infrastructure (Ray, modal, FastAPI + EC2, etc...)
-4. Additional storage integrations. More integrations with technologies like MySQL, S3, etc. so you can run Burr on top of what you have available.
+4. Additional storage integrations. More integrations with technologies like MySQL, S3, etc. so you can run Apache Burr on top of what you have available.
 
-If you want to avoid self-hosting the above solutions we're building Burr Cloud. To let us know you're interested
-sign up [here](https://forms.gle/w9u2QKcPrztApRedA) for the waitlist to get access.
 
 ## 🀲 Contributing
 
diff --git a/burr/core/application.py b/burr/core/application.py
index c77ba930..d6c406aa 100644
--- a/burr/core/application.py
+++ b/burr/core/application.py
@@ -184,9 +184,9 @@ def _state_update(state_to_modify: State, modified_state: State) -> State:
 
     This is suboptimal -- we should not be observing the state, we should be using the state commands and layering in deltas.
     That said, we currently eagerly evaluate the state at all operations, which means we have to do it this way. See
-    https://github.com/DAGWorks-Inc/burr/issues/33 for a more detailed plan.
+    https://github.com/apache/burr/issues/33 for a more detailed plan.
 
-    This function was written to solve this issue: https://github.com/DAGWorks-Inc/burr/issues/28.
+    This function was written to solve this issue: https://github.com/apache/burr/issues/28.
 
 
     :param state_subset_pre_update: The subset of state passed to the update() function
@@ -835,7 +835,7 @@ class Application(Generic[ApplicationStateType]):
         )
 
     # @telemetry.capture_function_usage # todo -- capture usage when we break this up into one that isn't called internally
-    # This will be doable when we move sequence ID to the beginning of the function https://github.com/DAGWorks-Inc/burr/pull/73
+    # This will be doable when we move sequence ID to the beginning of the function https://github.com/apache/burr/pull/73
     @_call_execute_method_pre_post(ExecuteMethod.step)
     def step(self, inputs: Optional[Dict[str, Any]] = None) -> Optional[Tuple[Action, dict, State]]:
         """Performs a single step, advancing the state machine along.
diff --git a/burr/core/state.py b/burr/core/state.py
index 85e89a8d..310ed029 100644
--- a/burr/core/state.py
+++ b/burr/core/state.py
@@ -290,7 +290,7 @@ class State(Mapping, Generic[StateType]):
             # This ensures we only copy the fields that are read by value
             # and copy the others by value
             # TODO -- make this more efficient when we have immutable transactions
-            # with event-based history: https://github.com/DAGWorks-Inc/burr/issues/33
+            # with event-based history: https://github.com/apache/burr/issues/33
             if field in new_state:
                 # currently the reads() includes optional fields
                 # We should clean that up, but this is an internal API so not worried now
diff --git a/burr/tracking/server/backend.py b/burr/tracking/server/backend.py
index 2faa06cb..32e4f06c 100644
--- a/burr/tracking/server/backend.py
+++ b/burr/tracking/server/backend.py
@@ -240,10 +240,10 @@ def safe_json_load(line: bytes):
 
 def get_uri(project_id: str) -> str:
     project_id_map = {
-        "demo_counter": "https://github.com/DAGWorks-Inc/burr/tree/main/examples/hello-world-counter",
-        "demo_tracing": "https://github.com/DAGWorks-Inc/burr/tree/main/examples/tracing-and-spans/application.py",
-        "demo_chatbot": "https://github.com/DAGWorks-Inc/burr/tree/main/examples/multi-modal-chatbot",
-        "demo_conversational-rag": "https://github.com/DAGWorks-Inc/burr/tree/main/examples/conversational-rag",
+        "demo_counter": "https://github.com/apache/burr/tree/main/examples/hello-world-counter",
+        "demo_tracing": "https://github.com/apache/burr/tree/main/examples/tracing-and-spans/application.py",
+        "demo_chatbot": "https://github.com/apache/burr/tree/main/examples/multi-modal-chatbot",
+        "demo_conversational-rag": "https://github.com/apache/burr/tree/main/examples/conversational-rag",
     }
     return project_id_map.get(project_id, "")
 
diff --git a/docs/concepts/hooks.rst b/docs/concepts/hooks.rst
index 38f96edd..30bbb43f 100644
--- a/docs/concepts/hooks.rst
+++ b/docs/concepts/hooks.rst
@@ -8,7 +8,7 @@ Hooks
     Hooks allow you to customize every aspect of Burr's execution, plugging in whatever tooling,
     observability framework, or debugging tool you need.
 
-Apache Burr has a system of lifecycle adapters (adapted from the similar `Hamilton <https://github.com/dagworks-inc/hamilton>`_ concept), which allow you to run tooling before and after
+Apache Burr has a system of lifecycle adapters (adapted from the similar `Hamilton <https://github.com/apache/hamilton>`_ concept), which allow you to run tooling before and after
 various places in a node's execution. For instance, you could:
 
 1. Log every step as a trace in datadog
diff --git a/docs/concepts/serde.rst b/docs/concepts/serde.rst
index 37a84748..06e5df43 100644
--- a/docs/concepts/serde.rst
+++ b/docs/concepts/serde.rst
@@ -29,7 +29,7 @@ Here's a video walkthrough of how to add custom type and field serialization/des
         <iframe width="800" height="455" src="https://www.youtube.com/embed/Squ5IAeQBzc?si=6l6e0SJ0EqEjAW2K" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>
     </div>
 
-See `this example <https://github.com/DAGWorks-Inc/burr/tree/main/examples/custom-serde>`_ for the notebook.
+See `this example <https://github.com/apache/burr/tree/main/examples/custom-serde>`_ for the notebook.
 
 Type based serialization/deserialization
 _____________________________________________________
diff --git a/docs/concepts/state-typing.rst b/docs/concepts/state-typing.rst
index 59276c81..1edefeb5 100644
--- a/docs/concepts/state-typing.rst
+++ b/docs/concepts/state-typing.rst
@@ -11,7 +11,7 @@ Typing State
     the state of your application prior to execution (for use by a web-server/other typing system). This is done
     through the use of pydantic schemas.
 
-For a quick-start guide, see the `example <https://github.com/DAGWorks-Inc/burr/tree/main/examples/typed-state>`_
+For a quick-start guide, see the `example <https://github.com/apache/burr/tree/main/examples/typed-state>`_
 
 
 Apache Burr has two approaches to specifying a schema for a state. These can work together as long as they specify clashing state:
diff --git a/docs/examples/agents/agent-patterns.md b/docs/examples/agents/agent-patterns.md
index 8f1df814..e89c3bf4 100644
--- a/docs/examples/agents/agent-patterns.md
+++ b/docs/examples/agents/agent-patterns.md
@@ -7,25 +7,25 @@ We have the following templates:
 
 ## Multimodal agent
 
-[Code template](https://github.com/DAGWorks-Inc/burr/tree/main/examples/templates/multi_modal_agent.py)
+[Code template](https://github.com/apache/burr/tree/main/examples/templates/multi_modal_agent.py)
 
 ![](./_agent_patterns/multi_modal_agent.png)
 
 ## Multi-agent collaboration
 
-[Code template](https://github.com/DAGWorks-Inc/burr/tree/main/examples/templates/multi_agent_collaboration.py)
+[Code template](https://github.com/apache/burr/tree/main/examples/templates/multi_agent_collaboration.py)
 
 ![](./_agent_patterns/multi_agent_collaboration.png)
 
 ## Supervisor agent
 
-[Code template](https://github.com/DAGWorks-Inc/burr/tree/main/examples/templates/agent_supervisor.py)
+[Code template](https://github.com/apache/burr/tree/main/examples/templates/agent_supervisor.py)
 
 ![](./_agent_patterns/agent_supervisor.png)
 
 
 ## Hierarchical teams
 
-[Code template](https://github.com/DAGWorks-Inc/burr/tree/main/examples/templates/hierarchical_agent_teams.py)
+[Code template](https://github.com/apache/burr/tree/main/examples/templates/hierarchical_agent_teams.py)
 
 ![](./_agent_patterns/hierarchical_agent_teams.png)
diff --git a/docs/examples/agents/divide-and-conquer.md b/docs/examples/agents/divide-and-conquer.md
index cf6c4dfe..5b3ead77 100644
--- a/docs/examples/agents/divide-and-conquer.md
+++ b/docs/examples/agents/divide-and-conquer.md
@@ -5,11 +5,11 @@ A single agent can usually operate effectively using a handful of tools within a
 One way to approach complicated tasks is through a "divide-and-conquer" approach: create a "specialized agent" for each task or domain and route tasks to the correct "expert". This means that each agent can become a sequence of LLM calls that chooses how to use a specific "tool".
 
 The examples we link to below are inspired by the paper [AutoGen: Enabling Next-Gen LLM Applications via Multi-Agent Conversation by Wu, et. al.](https://arxiv.org/abs/2308.08155).
-They can be found in [this part of our repository](https://github.com/DAGWorks-Inc/burr/tree/main/examples/multi-agent-collaboration).
+They can be found in [this part of our repository](https://github.com/apache/burr/tree/main/examples/multi-agent-collaboration).
 
 ![](./_divide-and-conquer.png)
 
-We provide two implementations of this idea: with [Hamilton](https://github.com/DAGWorks-Inc/burr/tree/main/examples/multi-agent-collaboration/hamilton) and with [LangChain LCEL](https://github.com/DAGWorks-Inc/burr/tree/main/examples/multi-agent-collaboration/lcel). From Burr's standpoint, both look similar and you're free to use your preferred framework within an `action`.
+We provide two implementations of this idea: with [Hamilton](https://github.com/apache/burr/tree/main/examples/multi-agent-collaboration/hamilton) and with [LangChain LCEL](https://github.com/apache/burr/tree/main/examples/multi-agent-collaboration/lcel). From Burr's standpoint, both look similar and you're free to use your preferred framework within an `action`.
 
 ## With Hamilton
 
diff --git a/docs/examples/chatbots/gpt-like-chatbot.ipynb b/docs/examples/chatbots/gpt-like-chatbot.ipynb
index 5503d0b9..b3b7e49a 100644
--- a/docs/examples/chatbots/gpt-like-chatbot.ipynb
+++ b/docs/examples/chatbots/gpt-like-chatbot.ipynb
@@ -1001,10 +1001,10 @@
     "\n",
     "More blogs @ `blog.dagworks.io` e.g. [async & streaming](https://blog.dagworks.io/p/streaming-chatbot-with-burr-fastapi)\n",
     "\n",
-    "More [examples](https://github.com/DAGWorks-Inc/burr/tree/main/examples/):\n",
+    "More [examples](https://github.com/apache/burr/tree/main/examples/):\n",
     "\n",
     "- e.g. [test case creation](https://burr.dagworks.io/examples/guardrails/creating_tests/)\n",
-    "- e.g. [multi-agent collaboration](https://github.com/DAGWorks-Inc/burr/tree/main/examples/multi-agent-collaboration)\n",
+    "- e.g. [multi-agent collaboration](https://github.com/apache/burr/tree/main/examples/multi-agent-collaboration)\n",
     "\n",
     "Follow on Twitter & LinkedIn:\n",
     "\n",
diff --git a/docs/examples/chatbots/rag-chatbot-hamilton.ipynb b/docs/examples/chatbots/rag-chatbot-hamilton.ipynb
index 870e2b78..f4c51a7c 100644
--- a/docs/examples/chatbots/rag-chatbot-hamilton.ipynb
+++ b/docs/examples/chatbots/rag-chatbot-hamilton.ipynb
@@ -12,7 +12,7 @@
    "source": [
     "# Conversational RAG with Burr and Hamilton\n",
     "\n",
-    "See [GitHub example](https://github.com/DAGWorks-Inc/burr/tree/main/examples/conversational-rag/simple_example) and the accompanying video walkthrough:\n",
+    "See [GitHub example](https://github.com/apache/burr/tree/main/examples/conversational-rag/simple_example) and the accompanying video walkthrough:\n",
     "\n",
     "<div>\n",
     "    <iframe width=\"600\" height=\"380\" src=\"https://www.youtube.com/embed/t54DCiOH270?si=QpPNs7m2t0L0V8Va\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen></iframe>\n",
diff --git a/docs/examples/data-science/ml_training.md b/docs/examples/data-science/ml_training.md
index 9ab034e9..0a5de4d5 100644
--- a/docs/examples/data-science/ml_training.md
+++ b/docs/examples/data-science/ml_training.md
@@ -5,7 +5,7 @@ to gaussian processes, ``burr`` is a simple way to implement robust, failure-res
 track of hyperparameters and metrics and feeding these into a decision function to determine where to look next.
 ``burr``'s state machine can help you write and manage that process.
 
-This is a WIP! Please see the placeholder/example sketch in the [repository](https://github.com/DAGWorks-Inc/burr/tree/main/examples/ml-training) and contribute back if you have ideas via associated issue [here](https://github.com/DAGWorks-Inc/burr/issues/138).
+This is a WIP! Please see the placeholder/example sketch in the [repository](https://github.com/apache/burr/tree/main/examples/ml-training) and contribute back if you have ideas via associated issue [here](https://github.com/apache/burr/issues/138).
 
 ## High-level view
 
diff --git a/docs/examples/data-science/simulation.md b/docs/examples/data-science/simulation.md
index e9c789eb..f43306e5 100644
--- a/docs/examples/data-science/simulation.md
+++ b/docs/examples/data-science/simulation.md
@@ -5,8 +5,8 @@ The user then manages the state, which becomes the input to the next time step,
 as output data to analyze. Burr provides a simple way for you to construct, manage, and
 introspect the state of your simulation.
 
-This example is a WIP -- see the placeholder/example sketch in the [repository](https://github.com/DAGWorks-Inc/burr/tree/main/examples/simulation).
-We're actively looking for contributors + ideas via [this issue](https://github.com/DAGWorks-Inc/burr/issues/136) to track.
+This example is a WIP -- see the placeholder/example sketch in the [repository](https://github.com/apache/burr/tree/main/examples/simulation).
+We're actively looking for contributors + ideas via [this issue](https://github.com/apache/burr/issues/136) to track.
 
 
 For instance:
@@ -32,7 +32,7 @@ This is a special case of time-series forecasting, in which one wants to simulat
 - `construct_portfolio` - uses the forecast to construct a portfolio
 - `evaluate_portfolio` - evaluates the portfolio
 
-Each one of these could be a DAG using [Hamilton](https://github.com/dagworks-inc/hamilton), or running any custom code.
+Each one of these could be a DAG using [Hamilton](https://github.com/apache/hamilton), or running any custom code.
 
 ## Multi-agent simulation
 
diff --git a/docs/examples/deployment/index.rst b/docs/examples/deployment/index.rst
index 4bed667a..16912dbc 100644
--- a/docs/examples/deployment/index.rst
+++ b/docs/examples/deployment/index.rst
@@ -12,7 +12,7 @@ To deploy a Burr application in production, you need to do three things:
 3. Monitor your application in production (highly recommended, but not required)
 
 Due to the large number of methods people have for deploying applications, we will not cover all of them here. That said,
-we really appreciate contributions! Please `open an issue <https://github.com/DAGWorks-Inc/burr/issues/new?assignees=&labels=&projects=&template=feature_request.md&title=>`_ if there's an example you'd like, and :ref:`contribute back <contributing>` if you
+we really appreciate contributions! Please `open an issue <https://github.com/apache/burr/issues/new?assignees=&labels=&projects=&template=feature_request.md&title=>`_ if there's an example you'd like, and :ref:`contribute back <contributing>` if you
 have an example that would add to this guide. We have created a variety of issues with placeholders and link to them in the docs.
 
 .. toctree::
diff --git a/docs/examples/deployment/infrastructure.rst b/docs/examples/deployment/infrastructure.rst
index 34dd2147..503780b3 100644
--- a/docs/examples/deployment/infrastructure.rst
+++ b/docs/examples/deployment/infrastructure.rst
@@ -3,14 +3,14 @@ Provisioning Infrastructure/Deploying
 -------------------------------------
 
 Burr is not opinionated about the method of deployment/cloud one uses. Any method that can run python code, or web-service will work
-(AWS, vercel, etc...). Note we aim to have more examples here -- see `this issue <https://github.com/DAGWorks-Inc/burr/issues/390>`_ to track!
+(AWS, vercel, etc...). Note we aim to have more examples here -- see `this issue <https://github.com/apache/burr/issues/390>`_ to track!
 
-- `Deploying Burr in an AWS lambda function <https://github.com/DAGWorks-Inc/burr/tree/main/examples/deployment/aws/lambda>`_
-- `Deploying Burr using BentoML <https://github.com/DAGWorks-Inc/burr/tree/main/examples/deployment/aws/bentoml>`_
+- `Deploying Burr in an AWS lambda function <https://github.com/apache/burr/tree/main/examples/deployment/aws/lambda>`_
+- `Deploying Burr using BentoML <https://github.com/apache/burr/tree/main/examples/deployment/aws/bentoml>`_
 
 
 Using BentoML
 -------------
 `BentoML <https://github.com/bentoml/BentoML>`_ is a specialized tool to package, deploy, and manage AI services.
 For example, it allows you to create a REST API for your Burr application with minimal effort.
-See the `Burr + BentoML example <https://github.com/DAGWorks-Inc/burr/tree/main/examples/deployment/aws/bentoml>`_ for more information.
+See the `Burr + BentoML example <https://github.com/apache/burr/tree/main/examples/deployment/aws/bentoml>`_ for more information.
diff --git a/docs/examples/deployment/monitoring.rst 
b/docs/examples/deployment/monitoring.rst
index 4138b6af..b1d27eaf 100644
--- a/docs/examples/deployment/monitoring.rst
+++ b/docs/examples/deployment/monitoring.rst
@@ -8,17 +8,17 @@ and has a suite of useful capabilities for debugging Burr 
applications.
 It has two (current) implementations:
 
 1. `Local (filesystem) tracking 
<https://burr.dagworks.io/concepts/tracking/>`_ (default, for debugging or 
lower-scale production use-cases with a distributed file-system)
-2. `S3-based tracking 
<https://github.com/DAGWorks-Inc/burr/blob/main/burr/tracking/server/s3/README.md>`_
 (meant for production use-cases)
+2. `S3-based tracking 
<https://github.com/apache/burr/blob/main/burr/tracking/server/s3/README.md>`_ 
(meant for production use-cases)
 
 Which each come with an implementation of data storage on the server.
 
 To deploy these in production, you can follow the following examples:
 
 1. `Burr + FastAPI + docker 
<https://github.com/mdrideout/burr-fastapi-docker-compose>`_ by `Matthew 
Rideout <https://github.com/mdrideout>`_. This contains a sample API + UI + 
tracking server all bundled in one!
-2. `Docker compose + nginx proxy 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/email-assistant#running-the-ui-with-email-server-backend-in-a-docker-container>`_
 by `Aditha Kaushik <https://github.com/97k>`_ for the email assistant example, 
demonstrates running the docker image with the tracking server.
+2. `Docker compose + nginx proxy 
<https://github.com/apache/burr/tree/main/examples/email-assistant#running-the-ui-with-email-server-backend-in-a-docker-container>`_
 by `Aditha Kaushik <https://github.com/97k>`_ for the email assistant example, 
demonstrates running the docker image with the tracking server.
 
 We also have a few issues to document deploying Burr's monitoring system in 
production:
 
-- `deploy on AWS <https://github.com/DAGWorks-Inc/burr/issues/391>`_
-- `deploy on GCP <https://github.com/DAGWorks-Inc/burr/issues/392>`_
-- `deploy on Azure <https://github.com/DAGWorks-Inc/burr/issues/393>`_
+- `deploy on AWS <https://github.com/apache/burr/issues/391>`_
+- `deploy on GCP <https://github.com/apache/burr/issues/392>`_
+- `deploy on Azure <https://github.com/apache/burr/issues/393>`_
diff --git a/docs/examples/deployment/web-server.rst 
b/docs/examples/deployment/web-server.rst
index dd07ec5f..2b0cf92b 100644
--- a/docs/examples/deployment/web-server.rst
+++ b/docs/examples/deployment/web-server.rst
@@ -7,13 +7,13 @@ We like `fastAPI <https://fastapi.tiangolo.com/>`_, but Burr 
can work with any p
 
 To run Burr in a FastAPI server, see the following examples:
 
-- `Human in the loop FastAPI server 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/web-server>`_ (`TDS 
blog post 
<https://towardsdatascience.com/building-an-email-assistant-application-with-burr-324bc34c547d>`__
 )
-- `OpenAI-compatible agent with FastAPI 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/openai-compatible-agent>`_
-- `Streaming server using SSE + FastAPI 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/streaming-fastapi>`_  
(`TDS blog post 
<https://towardsdatascience.com/how-to-build-a-streaming-agent-with-burr-fastapi-and-react-e2459ef527a8>`__
 )
-- `Use typed state with Pydantic + FastAPI 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/typed-state>`_
+- `Human in the loop FastAPI server 
<https://github.com/apache/burr/tree/main/examples/web-server>`_ (`TDS blog 
post 
<https://towardsdatascience.com/building-an-email-assistant-application-with-burr-324bc34c547d>`__
 )
+- `OpenAI-compatible agent with FastAPI 
<https://github.com/apache/burr/tree/main/examples/openai-compatible-agent>`_
+- `Streaming server using SSE + FastAPI 
<https://github.com/apache/burr/tree/main/examples/streaming-fastapi>`_  (`TDS 
blog post 
<https://towardsdatascience.com/how-to-build-a-streaming-agent-with-burr-fastapi-and-react-e2459ef527a8>`__
 )
+- `Use typed state with Pydantic + FastAPI 
<https://github.com/apache/burr/tree/main/examples/typed-state>`_
 - `Burr + FastAPI + docker 
<https://github.com/mdrideout/burr-fastapi-docker-compose>`_ by `Matthew 
Rideout <https://github.com/mdrideout>`_. This contains a sample web server API 
+ UI + tracking server all bundled in one!
-- `Docker compose + nginx proxy 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/email-assistant#running-the-ui-with-email-server-backend-in-a-docker-container>`_
 by `Aditha Kaushik <https://github.com/97k>`_ for the email assistant example, 
demonstrates running the docker image with the tracking server.
-- `BentoML + Burr 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/deployment/aws/bentoml>`_
 for deploying Burr with BentoML as a web-service.
+- `Docker compose + nginx proxy 
<https://github.com/apache/burr/tree/main/examples/email-assistant#running-the-ui-with-email-server-backend-in-a-docker-container>`_
 by `Aditha Kaushik <https://github.com/97k>`_ for the email assistant example, 
demonstrates running the docker image with the tracking server.
+- `BentoML + Burr 
<https://github.com/apache/burr/tree/main/examples/deployment/aws/bentoml>`_ 
for deploying Burr with BentoML as a web-service.
 
 Connecting to a database
 ------------------------
@@ -22,4 +22,4 @@ To connect Burr to a database, you can use one of the 
provided persisters, or bu
 
 - :ref:`Documentation on persistence <state-persistence>`
 - :ref:`Set of available persisters <persistersref>`
-- `Simple chatbot intro with persistence to SQLLite 
<https://github.com/DAGWorks-Inc/burr/blob/main/examples/simple-chatbot-intro/notebook.ipynb>`_
+- `Simple chatbot intro with persistence to SQLite 
<https://github.com/apache/burr/blob/main/examples/simple-chatbot-intro/notebook.ipynb>`_
diff --git a/docs/examples/guardrails/creating_tests.rst 
b/docs/examples/guardrails/creating_tests.rst
index def1d89d..d88dfbdb 100644
--- a/docs/examples/guardrails/creating_tests.rst
+++ b/docs/examples/guardrails/creating_tests.rst
@@ -16,7 +16,7 @@ is what we're showing you how to do here.
 
 Need to know more about pytest?
 -------------------------------
-For a more pytest walkthrough and example, see the `pytest example 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/pytest>`_,
+For a more pytest walkthrough and example, see the `pytest example 
<https://github.com/apache/burr/tree/main/examples/pytest>`_,
 that explains what pytest is, how to evaluate more than just a single assert 
statement, how to aggregate results, etc.
 
 
@@ -54,7 +54,7 @@ Steps:
       --sequence-id 0 \
       --target-file-name /tmp/test-case.json
 
-See `github repository example 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/test-case-creation>`_
+See `github repository example 
<https://github.com/apache/burr/tree/main/examples/test-case-creation>`_
 for an example.
 
 Note (1): if you have custom serialization/deserialization logic, you will 
want to pass in `--serde-module` to the
diff --git a/docs/getting_started/simple-example.rst 
b/docs/getting_started/simple-example.rst
index c61e09f4..0badcb43 100644
--- a/docs/getting_started/simple-example.rst
+++ b/docs/getting_started/simple-example.rst
@@ -25,13 +25,13 @@ So hold tight! This gets you started with the basics but 
there's a lot more you
 
     This should take about 10 minutes to complete, and give you a good sense 
of the library basics.
     You'll need an OpenAI key set as the environment variable 
``OPENAI_API_KEY``. If you don't have one you can get one at `OpenAI 
<https://platform.openai.com>`_.
-    If you don't want to get one, check out the simple example of a `counter 
application 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/hello-world-counter>`_.
+    If you don't want to get one, check out the simple example of a `counter 
application 
<https://github.com/apache/burr/tree/main/examples/hello-world-counter>`_.
 
     If you want to skip ahead to the cool stuff (chatbots, ML training, 
simulations, etc...) feel free to jump into the deep end and start with the 
:ref:`examples <examples>`.
 
 πŸ€” If you prefer to learn by video, check out
 `this video walkthrough <https://www.youtube.com/watch?v=rEZ4oDN0GdU>`_
-using `this notebook 
<https://github.com/DAGWorks-Inc/burr/blob/main/examples/simple-chatbot-intro/notebook.ipynb>`_.
+using `this notebook 
<https://github.com/apache/burr/blob/main/examples/simple-chatbot-intro/notebook.ipynb>`_.
 
 ----------------------
 Build a Simple Chatbot
@@ -80,7 +80,7 @@ Before we proceed, let's note the following about how we 
define these actions:
 
 1. State is a dictionary -- actions declare input fields (as strings) and 
write values to those fields
 2. Actions use a specific *immutable* state object and call operations on it 
(``.append(...)``, ``.update(...)``)
-3. Functions can do whatever you want -- they can use plain python, or 
delegate to `langchain <https://langchain.com>`_, `hamilton 
<https://github.com/dagworks-inc/hamilton>`_, etc... All they have to do is 
return the new state.
+3. Functions can do whatever you want -- they can use plain python, or 
delegate to `langchain <https://langchain.com>`_, `hamilton 
<https://github.com/apache/hamilton>`_, etc... All they have to do is return 
the new state.
 4. We declare the parameter ``prompt``, meaning that we will expect the user 
to pass ``prompt`` every time they run the graph.
 
 .. note::
@@ -203,4 +203,4 @@ Now that we've built a basic application, we can do the 
following with only a fe
 2. :ref:`Persist state to a database + reload <state-persistence>` -- add a 
``initialize_from`` line to the builder and select a pre-existing/implement a 
custom persistence method.
 3. :ref:`Add monitoring to track application data <tracking>` -- leverage 
``with_tracker`` to track to the Burr UI and visualize your application live.
 4. :ref:`Stream results back <streaming>` -- minimize time to first token by 
streaming results back to the user.
-5. `Generate test cases from prior runs 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/test-case-creation>`_ 
-- use the ``burr-testburr-test-case create`` command to automatically generate 
test cases for your LLM app.
+5. `Generate test cases from prior runs 
<https://github.com/apache/burr/tree/main/examples/test-case-creation>`_ -- use 
the ``burr-test-case create`` command to automatically generate test 
cases for your LLM app.
diff --git a/docs/main.rst b/docs/main.rst
index 36e5cb56..5de79c4f 100644
--- a/docs/main.rst
+++ b/docs/main.rst
@@ -6,7 +6,7 @@ Welcome to Apache Burr's (incubating) documentation.
 
 For a quick overview of Burr, watch `this walkthrough 
<https://www.loom.com/share/a10f163428b942fea55db1a84b1140d8?sid=1512863b-f533-4a42-a2f3-95b13deb07c9>`_
 or read `our blog post 
<https://blog.dagworks.io/p/burr-develop-stateful-ai-applications?r=2cg5z1&utm_campaign=post&utm_medium=web>`_.
 The following video is
-a longer demo of building a simple chatbot application with Burr using `this 
notebook 
<https://github.com/DAGWorks-Inc/burr/blob/main/examples/conversational-rag/simple_example/notebook.ipynb>`_:
+a longer demo of building a simple chatbot application with Burr using `this 
notebook 
<https://github.com/apache/burr/blob/main/examples/conversational-rag/simple_example/notebook.ipynb>`_:
 
 .. raw:: html
 
@@ -22,8 +22,8 @@ You'll find this documentation separated into three sections.
 
 We also ask that you:
 
-- Report any bugs, issues, or feature requests via `GitHub Issues 
<https://github.com/DAGWorks-Inc/burr/issues>`_ or \
-  `GitHub Discussions <https://github.com/DAGWorks-Inc/burr/discussions>`_.
+- Report any bugs, issues, or feature requests via `GitHub Issues 
<https://github.com/apache/burr/issues>`_ or \
+  `GitHub Discussions <https://github.com/apache/burr/discussions>`_.
 - Give us a star on `GitHub <https://github.com/apache/burr>`_ if you like the 
project!
 
 
diff --git a/docs/reference/integrations/langchain.rst 
b/docs/reference/integrations/langchain.rst
index 4b6436d0..78cdf572 100644
--- a/docs/reference/integrations/langchain.rst
+++ b/docs/reference/integrations/langchain.rst
@@ -6,13 +6,13 @@ Burr works out of the box with langchain, as Burr delegates 
to any python code.
 
 There are multiple examples of Burr leveraging langchain, including:
 
-- `Multi agent collaboration 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/multi-agent-collaboration/lcel>`_
-- `LCEL + Hamilton together 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/multi-agent-collaboration/hamilton>`_
+- `Multi agent collaboration 
<https://github.com/apache/burr/tree/main/examples/multi-agent-collaboration/lcel>`_
+- `LCEL + Hamilton together 
<https://github.com/apache/burr/tree/main/examples/multi-agent-collaboration/hamilton>`_
 
 Burr also provides custom ser/deserialization for langchain objects. See the 
following resources:
-1. `Example 
<https://github.com/DAGWorks-Inc/burr/tree/main/examples/custom-serde>`_
+1. `Example <https://github.com/apache/burr/tree/main/examples/custom-serde>`_
 2. :ref:`Custom serialization docs <serde>`
-3. `Langchain serialization plugin 
<https://github.com/DAGWorks-Inc/burr/blob/main/burr/integrations/serde/langchain.py>`_
+3. `Langchain serialization plugin 
<https://github.com/apache/burr/blob/main/burr/integrations/serde/langchain.py>`_
 
 We are working on adding more builtin support for LCEL (LCELActions), and 
considering adding burr callbacks for tracing langgraph in the Burr
 UI. If you have any suggestions, please let us know.
diff --git a/examples/conversational-rag/simple_example/README.md 
b/examples/conversational-rag/simple_example/README.md
index c6068b42..07e53753 100644
--- a/examples/conversational-rag/simple_example/README.md
+++ b/examples/conversational-rag/simple_example/README.md
@@ -37,7 +37,7 @@ You'll then have a text terminal where you can interact. Type 
exit to stop.
 ![Application Image](statemachine.png)
 
 # Video Walkthrough via Notebook
-Open the notebook <a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/conversational-rag/simple_example/notebook.ipynb";>
+Open the notebook <a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/conversational-rag/simple_example/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
 
diff --git a/examples/custom-serde/README.md b/examples/custom-serde/README.md
index 09463795..bff195a8 100644
--- a/examples/custom-serde/README.md
+++ b/examples/custom-serde/README.md
@@ -19,7 +19,7 @@ pip install jupyter
 jupyter notebook
 ``
 
-and running the notebook. Or <a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/custom-serde/notebook.ipynb";>
+and running the notebook. Or <a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/custom-serde/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>.
 
diff --git a/examples/email-assistant/README.md 
b/examples/email-assistant/README.md
index 723b5804..fc49cbf3 100644
--- a/examples/email-assistant/README.md
+++ b/examples/email-assistant/README.md
@@ -29,7 +29,7 @@ Note we will be adding two things to this demo:
 1. An integration with the burr web app
 2. a standalone server example with a walkthrough
 
-Open the notebook <a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/email-assistant/notebook.ipynb";>
+Open the notebook <a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/email-assistant/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
 
diff --git a/examples/hamilton-integration/notebook.ipynb 
b/examples/hamilton-integration/notebook.ipynb
index f7f02164..b7d257f4 100644
--- a/examples/hamilton-integration/notebook.ipynb
+++ b/examples/hamilton-integration/notebook.ipynb
@@ -24,7 +24,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# Use the 2-layer approach for a maintainable RAG system [![Open In 
Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/hamilton-integration/notebook.ipynb)
 [![GitHub 
badge](https://img.shields.io/badge/github-view_source-2b3137?logo=github)](https://github.com/dagworks-inc/burr/blob/main/examples/hamilton-integration/notebook.ipynb)\n",
+    "# Use the 2-layer approach for a maintainable RAG system [![Open In 
Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/apache/burr/blob/main/examples/hamilton-integration/notebook.ipynb)
 [![GitHub 
badge](https://img.shields.io/badge/github-view_source-2b3137?logo=github)](https://github.com/apache/burr/blob/main/examples/hamilton-integration/notebook.ipynb)\n",
     "\n",
     "Ready-made solutions can get you started with GenAI, but building 
reliable product features with retrieval augmented generation (RAG) and LLM 
agents inevitably required custom code. This post shares the 2-layer approach 
to build a maintainable RAG application that will evolve with your needs. To 
illustrate these ideas, we will show how a typical RAG project might evolve.\n",
     "\n",
diff --git a/examples/hello-world-counter/README.md 
b/examples/hello-world-counter/README.md
index 89d4a258..0338b50d 100644
--- a/examples/hello-world-counter/README.md
+++ b/examples/hello-world-counter/README.md
@@ -7,7 +7,7 @@ We have three files:
 - [application.py](application.py) -- This contains a mainline to run the 
counter as well as a function to export the counter (for later use)
 - [requirements.txt](requirements.txt) -- Just the requirements. All this 
needs is Burr/Streamlit
 - [streamlit_app.py](streamlit_app.py) -- This contains a simple Streamlit app 
to interact with the counter.
-- [notebook.ipynb](notebook.ipynb) -- A notebook that shows the counter app 
too. Open the notebook <a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/hello-world-counter/notebook.ipynb";>
+- [notebook.ipynb](notebook.ipynb) -- A notebook that shows the counter app 
too. Open the notebook <a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/hello-world-counter/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
 
diff --git a/examples/image-telephone/README.md 
b/examples/image-telephone/README.md
index 54938054..cefa793f 100644
--- a/examples/image-telephone/README.md
+++ b/examples/image-telephone/README.md
@@ -25,7 +25,7 @@ We recommend starting with the notebook.
 
 ### notebook.ipynb
 You can use [notebook.ipynb](./notebook.ipynb) to run things. Or
-<a target="_blank" 
href="https://colab.research.google.com/github/DAGWorks-Inc/burr/blob/main/examples/image-telephone/notebook.ipynb";>
+<a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/image-telephone/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
 
diff --git a/examples/llm-adventure-game/README.md 
b/examples/llm-adventure-game/README.md
index d2e00f30..a9a63680 100644
--- a/examples/llm-adventure-game/README.md
+++ b/examples/llm-adventure-game/README.md
@@ -7,7 +7,7 @@ How to run:
 OPENAI_API_KEY=<your key> python application.py
 ```
 
-Open the notebook <a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/llm-adventure-game/notebook.ipynb";>
+Open the notebook <a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/llm-adventure-game/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
 
diff --git a/examples/ml-training/README.md b/examples/ml-training/README.md
index ff588aa2..758f6115 100644
--- a/examples/ml-training/README.md
+++ b/examples/ml-training/README.md
@@ -1,6 +1,6 @@
 # ML Training
 
-This is a WIP! Please contribute back if you have ideas. You can track the 
associated issue [here](https://github.com/DAGWorks-Inc/burr/issues/138).
+This is a WIP! Please contribute back if you have ideas. You can track the 
associated issue [here](https://github.com/apache/burr/issues/138).
 
 A machine learning training system can easily be modeled as a state machine.
 
diff --git a/examples/multi-agent-collaboration/README.md 
b/examples/multi-agent-collaboration/README.md
index 59739ac3..a2e06aec 100644
--- a/examples/multi-agent-collaboration/README.md
+++ b/examples/multi-agent-collaboration/README.md
@@ -45,7 +45,7 @@ export TAVILY_API_KEY=YOUR_KEY
 To run the example, you can do:
 
 Run the notebook:
-<a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/multi-agent-collaboration/hamilton/notebook.ipynb";>
+<a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/multi-agent-collaboration/hamilton/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
 
@@ -58,7 +58,7 @@ Application run:
 or
 
 Run the notebook:
-<a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/multi-agent-collaboration/lcel/notebook.ipynb";>
+<a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/multi-agent-collaboration/lcel/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
 
diff --git a/examples/multi-agent-collaboration/hamilton/README.md 
b/examples/multi-agent-collaboration/hamilton/README.md
index 66895eac..65c10d4f 100644
--- a/examples/multi-agent-collaboration/hamilton/README.md
+++ b/examples/multi-agent-collaboration/hamilton/README.md
@@ -8,7 +8,7 @@ With Hamilton the prompts can be found in the module 
[`func_agent.py`](func_agen
 
 The Hamilton code creates the following dataflow:
 
-![dataflow](https://github.com/DAGWorks-Inc/burr/assets/2328071/24822ee5-f05b-4fa4-95e7-daa23969cfff)
+![dataflow](https://github.com/apache/burr/assets/2328071/24822ee5-f05b-4fa4-95e7-daa23969cfff)
 
 
 # Tracing
@@ -33,7 +33,7 @@ export TAVILY_API_KEY=YOUR_KEY
 ```
 
 Run the notebook:
-<a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/multi-agent-collaboration/hamilton/notebook.ipynb";>
+<a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/multi-agent-collaboration/hamilton/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
 or do it manually:
diff --git a/examples/multi-agent-collaboration/lcel/README.md 
b/examples/multi-agent-collaboration/lcel/README.md
index ae3fa5af..747d3f99 100644
--- a/examples/multi-agent-collaboration/lcel/README.md
+++ b/examples/multi-agent-collaboration/lcel/README.md
@@ -25,7 +25,7 @@ export TAVILY_API_KEY=YOUR_KEY
 ```
 
 Run the notebook:
-<a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/multi-agent-collaboration/lcel/notebook.ipynb";>
+<a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/multi-agent-collaboration/lcel/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
 or do it manually:
diff --git a/examples/multi-modal-chatbot/README.md 
b/examples/multi-modal-chatbot/README.md
index e25822e9..08177094 100644
--- a/examples/multi-modal-chatbot/README.md
+++ b/examples/multi-modal-chatbot/README.md
@@ -26,7 +26,7 @@ We have a few files:
 - [application.py](application.py) -- This contains a mainline to generate the 
graph portrayal.
 - [requirements.txt](requirements.txt) -- Just the requirements. All this 
needs is Burr/Streamlit/openai
 - [simple_streamlit_app.py](simple_streamlit_app.py) -- This contains a more 
sophisticated Streamlit app to interact with.
-- [notebook.ipynb](notebook.ipynb) -- A notebook that helps exercise things. 
<a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/multi-modal-chatbot/notebook.ipynb";>
+- [notebook.ipynb](notebook.ipynb) -- A notebook that helps exercise things. 
<a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/multi-modal-chatbot/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
 - [streamlit_app.py](streamlit_app.py) -- This contains a simple Streamlit app 
to interact with that is more
diff --git a/examples/multi-modal-chatbot/burr_demo.ipynb 
b/examples/multi-modal-chatbot/burr_demo.ipynb
index cf916962..83222b30 100644
--- a/examples/multi-modal-chatbot/burr_demo.ipynb
+++ b/examples/multi-modal-chatbot/burr_demo.ipynb
@@ -12,7 +12,7 @@
     "<img 
src=\"https://github.com/user-attachments/assets/2ab9b499-7ca2-4ae9-af72-ccc775f30b4e\";
 width=\"100\" align=\"left\" /> + \n",
     "<img src=\"https://cdn.mos.cms.futurecdn.net/VgGxJABA8DcfAMpPPwdv6a.jpg\"; 
width=\"200\" align=\"center\"/>\n",
     "\n",
-    
"[https://github.com/dagworks-inc/burr](https://github.com/dagworks-inc/burr) 
by DAGWorks Inc. (YCW23 & StartX).\n",
+    "[https://github.com/apache/burr](https://github.com/apache/burr) by 
DAGWorks Inc. (YCW23 & StartX).\n",
     "\n",
     "Take🏠:\n",
     "\n",
@@ -698,21 +698,21 @@
      "evalue": "Demo error",
      "output_type": "error",
      "traceback": [
-      
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
-      "\u001b[0;31mValueError\u001b[0m                                
Traceback (most recent call last)",
-      "Cell \u001b[0;32mIn[4], line 5\u001b[0m\n\u001b[1;32m      3\u001b[0m 
\u001b[38;5;28;01mif\u001b[39;00m 
\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mquit\u001b[39m\u001b[38;5;124m\"\u001b[39m
 \u001b[38;5;241m==\u001b[39m 
user_input\u001b[38;5;241m.\u001b[39mlower():\n\u001b[1;32m      4\u001b[0m     
\u001b[38;5;28;01mbreak\u001b[39;00m\n\u001b[0;32m----> 5\u001b[0m last_action, 
action_result, app_state \u001b[38;5;241m=\u001b[39m 
\u001b[43mapp\u001b[49m\u001b[38;5;241;43m.\u001b[39 [...]
-      "File \u001b[0;32m~/dagworks/burr/burr/telemetry.py:276\u001b[0m, in 
\u001b[0;36mcapture_function_usage.<locals>.wrapped_fn\u001b[0;34m(*args, 
**kwargs)\u001b[0m\n\u001b[1;32m    273\u001b[0m 
\u001b[38;5;129m@functools\u001b[39m\u001b[38;5;241m.\u001b[39mwraps(call_fn)\n\u001b[1;32m
    274\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m 
\u001b[38;5;21mwrapped_fn\u001b[39m(\u001b[38;5;241m*\u001b[39margs, 
\u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs):\n\u001b[1;32m    
27 [...]
-      "File \u001b[0;32m~/dagworks/burr/burr/core/application.py:616\u001b[0m, 
in 
\u001b[0;36m_call_execute_method_pre_post.__call__.<locals>.wrapper_sync\u001b[0;34m(app_self,
 *args, **kwargs)\u001b[0m\n\u001b[1;32m    614\u001b[0m exc 
\u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m\n\u001b[1;32m   
 615\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 616\u001b[0m 
    \u001b[38;5;28;01mreturn\u001b[39;00m 
\u001b[43mfn\u001b[49m\u001b[43m(\u001b[49m\u001b[43m [...]
-      "File 
\u001b[0;32m~/dagworks/burr/burr/core/application.py:1168\u001b[0m, in 
\u001b[0;36mApplication.run\u001b[0;34m(self, halt_before, halt_after, 
inputs)\u001b[0m\n\u001b[1;32m   1166\u001b[0m 
\u001b[38;5;28;01mwhile\u001b[39;00m 
\u001b[38;5;28;01mTrue\u001b[39;00m:\n\u001b[1;32m   1167\u001b[0m     
\u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m-> 1168\u001b[0m         
\u001b[38;5;28;43mnext\u001b[39;49m\u001b[43m(\u001b[49m\u001b[43mgen\u001b[49m\u001b[43m)\u001b[49m\n\u001b[
 [...]
-      "File 
\u001b[0;32m~/dagworks/burr/burr/core/application.py:1111\u001b[0m, in 
\u001b[0;36mApplication.iterate\u001b[0;34m(self, halt_before, halt_after, 
inputs)\u001b[0m\n\u001b[1;32m   1108\u001b[0m prior_action: Optional[Action] 
\u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m\n\u001b[1;32m   
1109\u001b[0m \u001b[38;5;28;01mwhile\u001b[39;00m 
\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mhas_next_action():\n\u001b[1;32m
   1110\u001b[0m     \u001b[38;5;66;0 [...]
-      "File \u001b[0;32m~/dagworks/burr/burr/core/application.py:616\u001b[0m, 
in 
\u001b[0;36m_call_execute_method_pre_post.__call__.<locals>.wrapper_sync\u001b[0;34m(app_self,
 *args, **kwargs)\u001b[0m\n\u001b[1;32m    614\u001b[0m exc 
\u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m\n\u001b[1;32m   
 615\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 616\u001b[0m 
    \u001b[38;5;28;01mreturn\u001b[39;00m 
\u001b[43mfn\u001b[49m\u001b[43m(\u001b[49m\u001b[43m [...]
-      "File \u001b[0;32m~/dagworks/burr/burr/core/application.py:773\u001b[0m, 
in \u001b[0;36mApplication.step\u001b[0;34m(self, 
inputs)\u001b[0m\n\u001b[1;32m    770\u001b[0m \u001b[38;5;66;03m# we need to 
increment the sequence before we start computing\u001b[39;00m\n\u001b[1;32m    
771\u001b[0m \u001b[38;5;66;03m# that way if we're replaying from state, we 
don't get stuck\u001b[39;00m\n\u001b[1;32m    772\u001b[0m 
\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_increment_sequ [...]
-      "File \u001b[0;32m~/dagworks/burr/burr/core/application.py:826\u001b[0m, 
in \u001b[0;36mApplication._step\u001b[0;34m(self, inputs, 
_run_hooks)\u001b[0m\n\u001b[1;32m    824\u001b[0m     exc 
\u001b[38;5;241m=\u001b[39m e\n\u001b[1;32m    825\u001b[0m     
logger\u001b[38;5;241m.\u001b[39mexception(_format_BASE_ERROR_MESSAGE(next_action,
 \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_state, 
inputs))\n\u001b[0;32m--> 826\u001b[0m     \u001b[38;5;28;01mraise\u001b[39;00m 
e\n\ [...]
-      "File \u001b[0;32m~/dagworks/burr/burr/core/application.py:812\u001b[0m, 
in \u001b[0;36mApplication._step\u001b[0;34m(self, inputs, 
_run_hooks)\u001b[0m\n\u001b[1;32m    810\u001b[0m 
\u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m    811\u001b[0m     
\u001b[38;5;28;01mif\u001b[39;00m 
next_action\u001b[38;5;241m.\u001b[39msingle_step:\n\u001b[0;32m--> 
812\u001b[0m         result, new_state \u001b[38;5;241m=\u001b[39m 
\u001b[43m_run_single_step_action\u001b[49m\u001b[43m(\u001b[49m [...]
-      "File \u001b[0;32m~/dagworks/burr/burr/core/application.py:252\u001b[0m, 
in \u001b[0;36m_run_single_step_action\u001b[0;34m(action, state, 
inputs)\u001b[0m\n\u001b[1;32m    249\u001b[0m \u001b[38;5;66;03m# TODO -- 
guard all reads/writes with a subset of the state\u001b[39;00m\n\u001b[1;32m    
250\u001b[0m 
action\u001b[38;5;241m.\u001b[39mvalidate_inputs(inputs)\n\u001b[1;32m    
251\u001b[0m result, new_state \u001b[38;5;241m=\u001b[39m 
_adjust_single_step_output(\n\u001b[0;32m--> 2 [...]
-      "File \u001b[0;32m~/dagworks/burr/burr/core/action.py:639\u001b[0m, in 
\u001b[0;36mFunctionBasedAction.run_and_update\u001b[0;34m(self, state, 
**run_kwargs)\u001b[0m\n\u001b[1;32m    638\u001b[0m 
\u001b[38;5;28;01mdef\u001b[39;00m 
\u001b[38;5;21mrun_and_update\u001b[39m(\u001b[38;5;28mself\u001b[39m, state: 
State, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mrun_kwargs) 
\u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m 
\u001b[38;5;28mtuple\u001b[39m[\u001b[38;5;28mdic [...]
-      "Cell \u001b[0;32mIn[1], line 94\u001b[0m, in 
\u001b[0;36mimage_response\u001b[0;34m(state, model)\u001b[0m\n\u001b[1;32m     
91\u001b[0m 
\u001b[38;5;129m@action\u001b[39m(reads\u001b[38;5;241m=\u001b[39m[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mprompt\u001b[39m\u001b[38;5;124m\"\u001b[39m,
 
\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mchat_history\u001b[39m\u001b[38;5;124m\"\u001b[39m,
 
\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mmode\u001b[39m\u001b[38;5;124m\"\u001b[39m],
 writes [...]
-      "\u001b[0;31mValueError\u001b[0m: Demo error"
+      
"\u001B[0;31m---------------------------------------------------------------------------\u001B[0m",
+      "\u001B[0;31mValueError\u001B[0m                                
Traceback (most recent call last)",
+      "Cell \u001B[0;32mIn[4], line 5\u001B[0m\n\u001B[1;32m      3\u001B[0m 
\u001B[38;5;28;01mif\u001B[39;00m 
\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mquit\u001B[39m\u001B[38;5;124m\"\u001B[39m
 \u001B[38;5;241m==\u001B[39m 
user_input\u001B[38;5;241m.\u001B[39mlower():\n\u001B[1;32m      4\u001B[0m     
\u001B[38;5;28;01mbreak\u001B[39;00m\n\u001B[0;32m----> 5\u001B[0m last_action, 
action_result, app_state \u001B[38;5;241m=\u001B[39m 
\u001B[43mapp\u001B[49m\u001B[38;5;241;43m.\u001B[39 [...]
+      "File \u001B[0;32m~/dagworks/burr/burr/telemetry.py:276\u001B[0m, in 
\u001B[0;36mcapture_function_usage.<locals>.wrapped_fn\u001B[0;34m(*args, 
**kwargs)\u001B[0m\n\u001B[1;32m    273\u001B[0m 
\u001B[38;5;129m@functools\u001B[39m\u001B[38;5;241m.\u001B[39mwraps(call_fn)\n\u001B[1;32m
    274\u001B[0m \u001B[38;5;28;01mdef\u001B[39;00m 
\u001B[38;5;21mwrapped_fn\u001B[39m(\u001B[38;5;241m*\u001B[39margs, 
\u001B[38;5;241m*\u001B[39m\u001B[38;5;241m*\u001B[39mkwargs):\n\u001B[1;32m    
27 [...]
+      "File \u001B[0;32m~/dagworks/burr/burr/core/application.py:616\u001B[0m, 
in 
\u001B[0;36m_call_execute_method_pre_post.__call__.<locals>.wrapper_sync\u001B[0;34m(app_self,
 *args, **kwargs)\u001B[0m\n\u001B[1;32m    614\u001B[0m exc 
\u001B[38;5;241m=\u001B[39m \u001B[38;5;28;01mNone\u001B[39;00m\n\u001B[1;32m   
 615\u001B[0m \u001B[38;5;28;01mtry\u001B[39;00m:\n\u001B[0;32m--> 616\u001B[0m 
    \u001B[38;5;28;01mreturn\u001B[39;00m 
\u001B[43mfn\u001B[49m\u001B[43m(\u001B[49m\u001B[43m [...]
+      "File 
\u001B[0;32m~/dagworks/burr/burr/core/application.py:1168\u001B[0m, in 
\u001B[0;36mApplication.run\u001B[0;34m(self, halt_before, halt_after, 
inputs)\u001B[0m\n\u001B[1;32m   1166\u001B[0m 
\u001B[38;5;28;01mwhile\u001B[39;00m 
\u001B[38;5;28;01mTrue\u001B[39;00m:\n\u001B[1;32m   1167\u001B[0m     
\u001B[38;5;28;01mtry\u001B[39;00m:\n\u001B[0;32m-> 1168\u001B[0m         
\u001B[38;5;28;43mnext\u001B[39;49m\u001B[43m(\u001B[49m\u001B[43mgen\u001B[49m\u001B[43m)\u001B[49m\n\u001B[
 [...]
+      "File 
\u001B[0;32m~/dagworks/burr/burr/core/application.py:1111\u001B[0m, in 
\u001B[0;36mApplication.iterate\u001B[0;34m(self, halt_before, halt_after, 
inputs)\u001B[0m\n\u001B[1;32m   1108\u001B[0m prior_action: Optional[Action] 
\u001B[38;5;241m=\u001B[39m \u001B[38;5;28;01mNone\u001B[39;00m\n\u001B[1;32m   
1109\u001B[0m \u001B[38;5;28;01mwhile\u001B[39;00m 
\u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39mhas_next_action():\n\u001B[1;32m
   1110\u001B[0m     \u001B[38;5;66;0 [...]
+      "File \u001B[0;32m~/dagworks/burr/burr/core/application.py:616\u001B[0m, 
in 
\u001B[0;36m_call_execute_method_pre_post.__call__.<locals>.wrapper_sync\u001B[0;34m(app_self,
 *args, **kwargs)\u001B[0m\n\u001B[1;32m    614\u001B[0m exc 
\u001B[38;5;241m=\u001B[39m \u001B[38;5;28;01mNone\u001B[39;00m\n\u001B[1;32m   
 615\u001B[0m \u001B[38;5;28;01mtry\u001B[39;00m:\n\u001B[0;32m--> 616\u001B[0m 
    \u001B[38;5;28;01mreturn\u001B[39;00m 
\u001B[43mfn\u001B[49m\u001B[43m(\u001B[49m\u001B[43m [...]
+      "File \u001B[0;32m~/dagworks/burr/burr/core/application.py:773\u001B[0m, 
in \u001B[0;36mApplication.step\u001B[0;34m(self, 
inputs)\u001B[0m\n\u001B[1;32m    770\u001B[0m \u001B[38;5;66;03m# we need to 
increment the sequence before we start computing\u001B[39;00m\n\u001B[1;32m    
771\u001B[0m \u001B[38;5;66;03m# that way if we're replaying from state, we 
don't get stuck\u001B[39;00m\n\u001B[1;32m    772\u001B[0m 
\u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_increment_sequ [...]
+      "File \u001B[0;32m~/dagworks/burr/burr/core/application.py:826\u001B[0m, 
in \u001B[0;36mApplication._step\u001B[0;34m(self, inputs, 
_run_hooks)\u001B[0m\n\u001B[1;32m    824\u001B[0m     exc 
\u001B[38;5;241m=\u001B[39m e\n\u001B[1;32m    825\u001B[0m     
logger\u001B[38;5;241m.\u001B[39mexception(_format_BASE_ERROR_MESSAGE(next_action,
 \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_state, 
inputs))\n\u001B[0;32m--> 826\u001B[0m     \u001B[38;5;28;01mraise\u001B[39;00m 
e\n\ [...]
+      "File \u001B[0;32m~/dagworks/burr/burr/core/application.py:812\u001B[0m, 
in \u001B[0;36mApplication._step\u001B[0;34m(self, inputs, 
_run_hooks)\u001B[0m\n\u001B[1;32m    810\u001B[0m 
\u001B[38;5;28;01mtry\u001B[39;00m:\n\u001B[1;32m    811\u001B[0m     
\u001B[38;5;28;01mif\u001B[39;00m 
next_action\u001B[38;5;241m.\u001B[39msingle_step:\n\u001B[0;32m--> 
812\u001B[0m         result, new_state \u001B[38;5;241m=\u001B[39m 
\u001B[43m_run_single_step_action\u001B[49m\u001B[43m(\u001B[49m [...]
+      "File \u001B[0;32m~/dagworks/burr/burr/core/application.py:252\u001B[0m, 
in \u001B[0;36m_run_single_step_action\u001B[0;34m(action, state, 
inputs)\u001B[0m\n\u001B[1;32m    249\u001B[0m \u001B[38;5;66;03m# TODO -- 
guard all reads/writes with a subset of the state\u001B[39;00m\n\u001B[1;32m    
250\u001B[0m 
action\u001B[38;5;241m.\u001B[39mvalidate_inputs(inputs)\n\u001B[1;32m    
251\u001B[0m result, new_state \u001B[38;5;241m=\u001B[39m 
_adjust_single_step_output(\n\u001B[0;32m--> 2 [...]
+      "File \u001B[0;32m~/dagworks/burr/burr/core/action.py:639\u001B[0m, in 
\u001B[0;36mFunctionBasedAction.run_and_update\u001B[0;34m(self, state, 
**run_kwargs)\u001B[0m\n\u001B[1;32m    638\u001B[0m 
\u001B[38;5;28;01mdef\u001B[39;00m 
\u001B[38;5;21mrun_and_update\u001B[39m(\u001B[38;5;28mself\u001B[39m, state: 
State, \u001B[38;5;241m*\u001B[39m\u001B[38;5;241m*\u001B[39mrun_kwargs) 
\u001B[38;5;241m-\u001B[39m\u001B[38;5;241m>\u001B[39m 
\u001B[38;5;28mtuple\u001B[39m[\u001B[38;5;28mdic [...]
+      "Cell \u001B[0;32mIn[1], line 94\u001B[0m, in 
\u001B[0;36mimage_response\u001B[0;34m(state, model)\u001B[0m\n\u001B[1;32m     
91\u001B[0m 
\u001B[38;5;129m@action\u001B[39m(reads\u001B[38;5;241m=\u001B[39m[\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mprompt\u001B[39m\u001B[38;5;124m\"\u001B[39m,
 
\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mchat_history\u001B[39m\u001B[38;5;124m\"\u001B[39m,
 
\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mmode\u001B[39m\u001B[38;5;124m\"\u001B[39m],
 writes [...]
+      "\u001B[0;31mValueError\u001B[0m: Demo error"
      ]
     }
    ],
@@ -994,7 +994,7 @@
     "\n",
     "[Link to video walking through this 
notebook](https://youtu.be/hqutVJyd3TI).\n",
     "\n",
-    
"[https://github.com/dagworks-inc/burr](https://github.com/dagworks-inc/burr)\n",
+    "[https://github.com/apache/burr](https://github.com/apache/burr)\n",
     "<img src=\"burr_qrcode.png\" width=\"125\"/>\n",
     "\n",
     "[Time Travel blog post & 
video:](https://blog.dagworks.io/p/travel-back-in-time-with-burr)\n",
@@ -1005,10 +1005,10 @@
     "\n",
     "More blogs @ `blog.dagworks.io` e.g. [async & 
streaming](https://blog.dagworks.io/p/streaming-chatbot-with-burr-fastapi)\n",
     "\n",
-    "More 
[examples](https://github.com/DAGWorks-Inc/burr/tree/main/examples/):\n",
+    "More [examples](https://github.com/apache/burr/tree/main/examples/):\n",
     "\n",
     "- e.g. [test case 
creation](https://burr.dagworks.io/examples/guardrails/creating_tests/)\n",
-    "- e.g. [multi-agent 
collaboration](https://github.com/DAGWorks-Inc/burr/tree/main/examples/multi-agent-collaboration)\n",
+    "- e.g. [multi-agent 
collaboration](https://github.com/apache/burr/tree/main/examples/multi-agent-collaboration)\n",
     "\n",
     "Follow on Twitter & LinkedIn:\n",
     "\n",
diff --git a/examples/other-examples/cowsay/README.md 
b/examples/other-examples/cowsay/README.md
index f9fac261..dcb007de 100644
--- a/examples/other-examples/cowsay/README.md
+++ b/examples/other-examples/cowsay/README.md
@@ -7,7 +7,7 @@ We have three files:
 - [application.py](application.py) -- This contains a mainline to run the 
cowsay app as well as a function to export the app (for later use)
 - [requirements.txt](requirements.txt) -- Just the requirements. All this 
needs is Burr/Streamlit/cowsay
 - [streamlit_app.py](streamlit_app.py) -- This contains a simple Streamlit app 
to interact with the cow
-- [notebook.ipynb](notebook.ipynb) -- A notebook that helps show things. <a 
target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/other-examples/cowsay/notebook.ipynb";>
+- [notebook.ipynb](notebook.ipynb) -- A notebook that helps show things. <a 
target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/other-examples/cowsay/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
 
diff --git a/examples/parallelism/README.md b/examples/parallelism/README.md
index 974e0b99..ef59c646 100644
--- a/examples/parallelism/README.md
+++ b/examples/parallelism/README.md
@@ -2,7 +2,7 @@
 
 In this example we go over Burr's parallelism capabilities. It is based on the 
documentation (https://burr.dagworks.io/concepts/parallelism/), demonstrating 
the `MapStates` capabilities.
 
-See [the notebook](./notebook.ipynb) for the full example. Or <a 
target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/parallelism/notebook.ipynb";>
+See [the notebook](./notebook.ipynb) for the full example. Or <a 
target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/parallelism/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
 
diff --git a/examples/parallelism/notebook.ipynb 
b/examples/parallelism/notebook.ipynb
index ac2f3391..2a775c66 100644
--- a/examples/parallelism/notebook.ipynb
+++ b/examples/parallelism/notebook.ipynb
@@ -5,10 +5,10 @@
    "id": "f4b744ec-ce8d-4e6b-b818-d86f6a869028",
    "metadata": {},
    "source": [
-    "<a target=\"_blank\" 
href=\"https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/parallelism/notebook.ipynb\";>\n",
+    "<a target=\"_blank\" 
href=\"https://colab.research.google.com/github/apache/burr/blob/main/examples/parallelism/notebook.ipynb\";>\n",
     "  <img src=\"https://colab.research.google.com/assets/colab-badge.svg\"; 
alt=\"Open In Colab\"/>\n",
     "</a> \n",
-    "or <a target=\"_blank\" 
href=\"https://www.github.com/dagworks-inc/burr/tree/main/examples/parallelism/notebook.ipynb\";>view
 source</a>\n",
+    "or <a target=\"_blank\" 
href=\"https://www.github.com/apache/burr/tree/main/examples/parallelism/notebook.ipynb\";>view
 source</a>\n",
     "\n",
     "For a video walkthrough of this notebook <a 
href=\"https://youtu.be/G7lw63IBSmY\";>click here</a>."
    ]
diff --git a/examples/simple-chatbot-intro/README.md 
b/examples/simple-chatbot-intro/README.md
index 62dd8dfa..9c021d1d 100644
--- a/examples/simple-chatbot-intro/README.md
+++ b/examples/simple-chatbot-intro/README.md
@@ -12,6 +12,6 @@ Run the notebook:
 jupyter notebook
 ```
 
-Then open `notebook.ipynb` and run the cells. Or <a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/simple-chatbot-intro/notebook.ipynb";>
+Then open `notebook.ipynb` and run the cells. Or <a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/simple-chatbot-intro/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
diff --git a/examples/simulation/README.md b/examples/simulation/README.md
index 0dc0585d..4c669332 100644
--- a/examples/simulation/README.md
+++ b/examples/simulation/README.md
@@ -1,6 +1,6 @@
 # Simulations
 
-This example is a WIP -- we're actively looking for contributors + ideas. See 
[this issue](https://github.com/DAGWorks-Inc/burr/issues/136) to track.
+This example is a WIP -- we're actively looking for contributors + ideas. See 
[this issue](https://github.com/apache/burr/issues/136) to track.
 
 At a high level, simulations generally run over a set of time steps and 
maintain state. The user then manages the state, which becomes
 the input to the next time step, as well as output data to analyze.
@@ -37,4 +37,4 @@ For multiple independent "agents", Burr could help model the 
way they interact.
 actions, or an action that loops over all "users". We are still figuring out 
the best way to model this, so reach out if you have ideas!
 
 
-Please comment at [this 
issue](https://github.com/DAGWorks-Inc/burr/issues/136) if you have any 
opinions on the above! We would love user-contributed examples.
+Please comment at [this issue](https://github.com/apache/burr/issues/136) if 
you have any opinions on the above! We would love user-contributed examples.
diff --git a/examples/streaming-fastapi/notebook.ipynb 
b/examples/streaming-fastapi/notebook.ipynb
index 5bb50255..064d0da7 100644
--- a/examples/streaming-fastapi/notebook.ipynb
+++ b/examples/streaming-fastapi/notebook.ipynb
@@ -23,7 +23,7 @@
     "This notebook only shows the streaming side. To check out FastAPI in 
Burr, check out\n",
     "- The [Burr code](./application.py) -- imported and used here\n",
     "- The [backend FastAPI server](./server.py) for the streaming output 
using SSE\n",
-    "- The [frontend typescript 
code](https://github.com/dagworks-inc/burr/blob/main/telemetry/ui/src/examples/StreamingChatbot.tsx)
 that renders and interacts with the stream\n",
+    "- The [frontend typescript 
code](https://github.com/apache/burr/blob/main/telemetry/ui/src/examples/StreamingChatbot.tsx)
 that renders and interacts with the stream\n",
     "\n",
     "You can view this demo in your app by running Burr:\n",
     "\n",
diff --git a/examples/streaming-overview/README.md 
b/examples/streaming-overview/README.md
index f1c33bd2..98cb7eb5 100644
--- a/examples/streaming-overview/README.md
+++ b/examples/streaming-overview/README.md
@@ -19,6 +19,6 @@ which demonstrates how to use streaming async. We have not 
hooked this up
 to a streamlit application yet, but that should be trivial.
 
 ## Notebook
-The notebook also shows how things work. <a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/streaming-overview/notebook.ipynb";>
+The notebook also shows how things work. <a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/streaming-overview/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
diff --git a/examples/test-case-creation/README.md 
b/examples/test-case-creation/README.md
index 9bce2807..e821ff0d 100644
--- a/examples/test-case-creation/README.md
+++ b/examples/test-case-creation/README.md
@@ -47,7 +47,7 @@ In `test_application.py` you'll find examples tests for a 
simple action
 that is found in `application.py`.
 
 ## Notebook
-The notebook also shows how things work. <a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/test-case-creation/notebook.ipynb";>
+The notebook also shows how things work. <a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/test-case-creation/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
 
diff --git a/examples/tracing-and-spans/README.md 
b/examples/tracing-and-spans/README.md
index 252ebd9f..5381b22e 100644
--- a/examples/tracing-and-spans/README.md
+++ b/examples/tracing-and-spans/README.md
@@ -11,6 +11,6 @@ These traces are used in the Burr UI. E.G. as follows:
 
 ![tracing](tracing_screencap.png)
 
-The notebook also shows how things work. <a target="_blank" 
href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/tracing-and-spans/notebook.ipynb";>
+The notebook also shows how things work. <a target="_blank" 
href="https://colab.research.google.com/github/apache/burr/blob/main/examples/tracing-and-spans/notebook.ipynb";>
   <img src="https://colab.research.google.com/assets/colab-badge.svg"; 
alt="Open In Colab"/>
 </a>
diff --git a/examples/tracing-and-spans/burr_otel_demo.ipynb 
b/examples/tracing-and-spans/burr_otel_demo.ipynb
index bbb87458..a391ac96 100644
--- a/examples/tracing-and-spans/burr_otel_demo.ipynb
+++ b/examples/tracing-and-spans/burr_otel_demo.ipynb
@@ -12,7 +12,7 @@
     "<img 
src=\"https://github.com/user-attachments/assets/2ab9b499-7ca2-4ae9-af72-ccc775f30b4e\";
 width=\"100\" align=\"left\" /> + \n",
     "<img src=\"https://cdn.mos.cms.futurecdn.net/VgGxJABA8DcfAMpPPwdv6a.jpg\"; 
width=\"200\" align=\"center\"/>\n",
     "\n",
-    
"[https://github.com/dagworks-inc/burr](https://github.com/dagworks-inc/burr) 
by DAGWorks Inc. (YCW23 & StartX).\n",
+    "[https://github.com/apache/burr](https://github.com/apache/burr) by 
DAGWorks Inc. (YCW23 & StartX).\n",
     "\n",
     "Take🏠:\n",
     "\n",
@@ -949,7 +949,7 @@
     "\n",
     "[Link to video walking through this 
notebook](https://youtu.be/hqutVJyd3TI).\n",
     "\n",
-    
"[https://github.com/dagworks-inc/burr](https://github.com/dagworks-inc/burr)\n",
+    "[https://github.com/apache/burr](https://github.com/apache/burr)\n",
     "<img src=\"burr_qrcode.png\" width=\"125\"/>\n",
     "\n",
     "[Time Travel blog post & 
video:](https://blog.dagworks.io/p/travel-back-in-time-with-burr)\n",
@@ -960,10 +960,10 @@
     "\n",
     "More blogs @ `blog.dagworks.io` e.g. [async & 
streaming](https://blog.dagworks.io/p/streaming-chatbot-with-burr-fastapi)\n",
     "\n",
-    "More 
[examples](https://github.com/DAGWorks-Inc/burr/tree/main/examples/):\n",
+    "More [examples](https://github.com/apache/burr/tree/main/examples/):\n",
     "\n",
     "- e.g. [test case 
creation](https://burr.dagworks.io/examples/guardrails/creating_tests/)\n",
-    "- e.g. [multi-agent 
collaboration](https://github.com/DAGWorks-Inc/burr/tree/main/examples/multi-agent-collaboration)\n",
+    "- e.g. [multi-agent 
collaboration](https://github.com/apache/burr/tree/main/examples/multi-agent-collaboration)\n",
     "\n",
     "Follow on Twitter & LinkedIn:\n",
     "\n",