codeant-ai-for-open-source[bot] commented on code in PR #35810:
URL: https://github.com/apache/superset/pull/35810#discussion_r2584005107


##########
superset-frontend/src/components/Datasource/components/DatasourceEditor/components/DatasetUsageTab/index.tsx:
##########
@@ -117,14 +125,22 @@ const DatasetUsageTab = ({
 
       try {
         await onFetchCharts(page, PAGE_SIZE, column, direction);
-        setCurrentPage(page);
-        setSortColumn(column);
-        setSortDirection(direction);
+
+        if (isMountedRef.current) {
+          setCurrentPage(page);
+          setSortColumn(column);
+          setSortDirection(direction);
+        }
       } catch (error) {
-        if (addDangerToastRef.current)
+        if ((error as Error).name === 'AbortError') return;
+
+        if (addDangerToastRef.current) {
           addDangerToastRef.current(t('Error fetching charts'));
+        }
       } finally {
-        setLoading(false);
+        if (isMountedRef.current) {

Review Comment:
   **Suggestion:** When multiple fetches overlap and an earlier request is 
cancelled via `AbortError`, the `finally` block still runs and calls 
`setLoading(false)`, which can incorrectly clear the loading state while a 
newer request is still in flight. Moving the loading cleanup out of `finally` 
and guarding it against `AbortError` ensures cancelled requests do not race 
with active ones and override their loading state. [race condition]
   
   **Severity Level:** Minor ⚠️
   ```suggestion
             setLoading(false);
           }
         } catch (error) {
           const isAbortError =
             (error as { name?: string } | null)?.name === 'AbortError';
   
           if (!isAbortError && addDangerToastRef.current) {
             addDangerToastRef.current(t('Error fetching charts'));
           }
   
           if (!isAbortError && isMountedRef.current) {
   ```
   <details>
   <summary><b>Why it matters? ⭐ </b></summary>
   
   The current code explicitly checks for `AbortError` in the `catch` block but
   still always runs the `finally` block, which calls `setLoading(false)` even
   for aborted requests. Because `finally` runs after the `return` in `catch`,
   an older, aborted request can flip `loading` back to `false` while a newer
   request is still in flight, producing an incorrect UI state. The suggested
   change moves the loading cleanup into the success and error paths and
   skips it for `AbortError`, which directly addresses this real race
   condition without introducing syntax or logic errors.
   </details>
   <details>
   <summary><b>Prompt for AI Agent 🤖 </b></summary>
   
   ```mdx
   This is a comment left during a code review.
   
   **Path:** 
superset-frontend/src/components/Datasource/components/DatasourceEditor/components/DatasetUsageTab/index.tsx
   **Line:** 135:141
   **Comment:**
        *Race Condition: When multiple fetches overlap and an earlier request is 
cancelled via AbortError, the `finally` block still runs and calls 
`setLoading(false)`, which can incorrectly clear the loading state while a 
newer request is still in flight. Moving the loading cleanup out of `finally` 
and guarding it against AbortError ensures cancelled requests do not race 
with active ones and override their loading state.
   
   Validate the correctness of the flagged issue. If correct, how can I resolve 
it? If you propose a fix, implement it and keep it concise.
   ```
   </details>
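
   The `finally`-after-`return` behavior at the heart of this comment can be 
reproduced outside React. The sketch below is a toy model (the 
`fetchWithFinally`/`fetchGuarded` helpers and module-level `loading` flag are 
hypothetical stand-ins, not the Superset component) showing that `finally` 
still runs after an early `return` in `catch`, and how guarding the cleanup 
avoids clobbering the state of a newer in-flight request:

   ```typescript
   // Shared state a newer request may still depend on.
   let loading = false;

   function abortError(): Error {
     const err = new Error('aborted');
     err.name = 'AbortError';
     return err;
   }

   // Buggy shape: `return` in `catch` does NOT skip `finally`.
   async function fetchWithFinally(aborted: boolean): Promise<void> {
     loading = true;
     try {
       if (aborted) throw abortError();
     } catch (error) {
       if ((error as Error).name === 'AbortError') return; // early return...
     } finally {
       loading = false; // ...but `finally` still runs and clears loading
     }
   }

   // Guarded shape: cleanup only on success or non-abort errors, so an
   // aborted request leaves a newer request's loading state alone.
   async function fetchGuarded(aborted: boolean): Promise<void> {
     loading = true;
     try {
       if (aborted) throw abortError();
       loading = false; // success path clears loading
     } catch (error) {
       const isAbortError = (error as Error).name === 'AbortError';
       if (!isAbortError) loading = false; // skipped for aborted requests
     }
   }
   ```

   With the guarded shape, an aborted request exits without touching 
`loading`, which is exactly what the suggestion above achieves in the 
component.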



##########
superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigModal.test.tsx:
##########
@@ -181,6 +181,8 @@ beforeAll(() => {
 });
 
 afterEach(() => {
+  jest.runOnlyPendingTimers();
+  jest.useRealTimers();

Review Comment:
   **Suggestion:** The global `afterEach` now unconditionally calls 
`jest.runOnlyPendingTimers()` and `jest.useRealTimers()` for every test, but 
most tests end with real timers (and some already restore timers themselves). 
Calling `jest.runOnlyPendingTimers()` when timers are not mocked throws a 
runtime error in Jest ("Timers are not mocked") and will cause those tests to 
fail. Removing these global timer calls and leaving timer management to the 
individual tests avoids this failure. [possible bug]
   
   **Severity Level:** Critical 🚨
   ```suggestion
   
   ```
   <details>
   <summary><b>Why it matters? ⭐ </b></summary>
   
   The global afterEach unconditionally calls jest.runOnlyPendingTimers() and
   jest.useRealTimers() for every test in this file. Most tests never call
   jest.useFakeTimers(), and there is no global fake-timer setup (no
   useFakeTimers in setupTests), so those tests run with real timers. In Jest,
   invoking runOnlyPendingTimers() when timers are not mocked throws
   ("Timers are not mocked"), which is a real runtime failure of the test
   suite, not a stylistic concern. Tests that do use fake timers already
   manage timers explicitly with useFakeTimers()/useRealTimers() in their own
   bodies, so removing the global timer calls does not break them; it just
   avoids calling runOnlyPendingTimers() under real timers. The improved code
   keeps the necessary jest.restoreAllMocks() while eliminating the
   problematic timer calls, directly addressing a genuine bug.
   </details>
   <details>
   <summary><b>Prompt for AI Agent 🤖 </b></summary>
   
   ```mdx
   This is a comment left during a code review.
   
   **Path:** 
superset-frontend/src/dashboard/components/nativeFilters/FiltersConfigModal/FiltersConfigModal.test.tsx
   **Line:** 184:185
   **Comment:**
        *Possible Bug: The global `afterEach` now unconditionally calls 
`jest.runOnlyPendingTimers()` and `jest.useRealTimers()` for every test, but 
most tests end with real timers (and some already restore timers themselves). 
Calling `jest.runOnlyPendingTimers()` when timers are not mocked throws a 
runtime error in Jest ("Timers are not mocked") and will cause those tests to 
fail. Removing these global timer calls and leaving timer management to the 
individual tests avoids this failure.
   
   Validate the correctness of the flagged issue. If correct, how can I resolve 
it? If you propose a fix, implement it and keep it concise.
   ```
   </details>
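
   The hazard described above — a global teardown invoking machinery that only 
some tests install — can be modeled without Jest. The sketch below uses 
hypothetical `installFakeTimers`/`flushPending` helpers (not Jest APIs) that 
mimic the "Timers are not mocked" error, and contrasts the unconditional 
global teardown with the per-test lifecycle the suggestion recommends:

   ```typescript
   // Toy stand-in for Jest's fake-timer state.
   let fakeTimersInstalled = false;

   function installFakeTimers(): void {
     fakeTimersInstalled = true;
   }

   // Mirrors jest.runOnlyPendingTimers(): throws unless timers are mocked.
   function flushPending(): void {
     if (!fakeTimersInstalled) {
       throw new Error('Timers are not mocked');
     }
   }

   function uninstallFakeTimers(): void {
     fakeTimersInstalled = false;
   }

   // Unconditional global teardown: throws for every test that never
   // installed fake timers — the failure mode flagged in the comment.
   function globalTeardownUnconditional(): void {
     flushPending();
     uninstallFakeTimers();
   }

   // Safer shape: each test that opts into fake timers flushes and
   // restores them itself, even when its body throws.
   function testWithFakeTimers(body: () => void): void {
     installFakeTimers();
     try {
       body();
       flushPending();
     } finally {
       uninstallFakeTimers();
     }
   }
   ```

   Keeping the global hook limited to mock cleanup (the equivalent of 
`jest.restoreAllMocks()`) and pushing timer teardown into the tests that use 
fake timers removes the unconditional `flushPending()` call entirely.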



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

