korbit-ai[bot] commented on code in PR #35478: URL: https://github.com/apache/superset/pull/35478#discussion_r2410737632
########## superset-frontend/src/components/StreamingExportModal/StreamingExportModal.tsx: ########## @@ -0,0 +1,215 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +import { styled, t } from '@superset-ui/core'; +import { Modal, Button, Typography, Progress } from 'antd'; + +const { Text } = Typography; + +export enum ExportStatus { + STREAMING = 'streaming', + COMPLETED = 'completed', + ERROR = 'error', + CANCELLED = 'cancelled', +} + +export interface StreamingProgress { + totalRows?: number; + rowsProcessed: number; + totalSize: number; + status: ExportStatus; + downloadUrl?: string; + error?: string; + filename?: string; + speed?: number; + mbPerSecond?: number; + elapsedTime?: number; +} + +interface StreamingExportModalProps { + visible: boolean; + onCancel: () => void; + onRetry?: () => void; + progress: StreamingProgress; + exportType: 'csv' | 'xlsx'; Review Comment: ### Unused exportType prop <sub></sub> <details> <summary>Tell me more</summary> ###### What is the issue? The exportType prop is defined in the interface but never used in the component implementation. 
###### Why this matters This creates a disconnect between the component's interface and its actual behavior, potentially misleading developers about the component's capabilities and making it harder to extend for different export types. ###### Suggested change ∙ *Feature Preview* Either remove the unused prop from the interface or implement logic to handle different export types: ```typescript // Option 1: Remove unused prop interface StreamingExportModalProps { visible: boolean; onCancel: () => void; onRetry?: () => void; progress: StreamingProgress; } // Option 2: Use the prop in the component const getDefaultFilename = () => { return exportType === 'xlsx' ? 'export.xlsx' : 'export.csv'; }; ``` </details> <sub> 💬 Looking for more details? Reply to this comment to chat with Korbit. </sub> ########## superset-frontend/src/components/StreamingExportModal/StreamingExportModal.tsx: ########## @@ -0,0 +1,215 @@ … +const ModalContent = styled.div` + padding: ${({ theme }) => theme.sizeUnit * 4}px 0 + ${({ theme }) => theme.sizeUnit * 2}px; +`; + +const ProgressSection = styled.div` + margin: ${({ theme }) => theme.sizeUnit * 6}px 0; +`; + +const ActionButtons = styled.div` + display: flex; + gap: ${({ theme }) => theme.sizeUnit * 2}px; + justify-content: flex-end; +`; + +const ProgressText = styled(Text)` + display: block; + text-align: center; + margin-top: ${({ theme }) => theme.sizeUnit * 4}px; +`; + +const ErrorText = styled(Text)` + display: block; + text-align: center; + margin-top: ${({ theme }) => theme.sizeUnit * 4}px; +`; + +const CancelButton = styled(Button)``; + +const
DownloadButton = styled(Button)``; + +const StreamingExportModal = ({ + visible, + onCancel, + onRetry, + progress, +}: StreamingExportModalProps) => { + const { status, downloadUrl, filename, error } = progress; + + const getProgressPercentage = (): number => { + if (status === ExportStatus.COMPLETED) return 100; + if (progress.totalRows && progress.totalRows > 0) { + const percentage = Math.min( + 99, + (progress.rowsProcessed / progress.totalRows) * 100, + ); + return Math.round(percentage); + } + return 0; + }; + + const handleDownload = () => { + if (downloadUrl) { + const link = document.createElement('a'); + link.href = downloadUrl; + link.download = filename || 'export.csv'; + document.body.appendChild(link); + link.click(); + document.body.removeChild(link); + onCancel(); + } + }; + + let content; + if (status === ExportStatus.ERROR) { + content = ( + <ModalContent> + <ProgressSection> + <Progress percent={0} status="exception" showInfo={false} /> + <ErrorText type="danger">{error || t('Export failed')}</ErrorText> + </ProgressSection> + <ActionButtons> + <CancelButton onClick={onCancel}>{t('Close')}</CancelButton> + {onRetry && ( + <DownloadButton type="primary" onClick={onRetry}> + {t('Retry')} + </DownloadButton> + )} + </ActionButtons> + </ModalContent> + ); + } else if (status === ExportStatus.CANCELLED) { + content = ( + <ModalContent> + <ProgressSection> + <Progress + percent={getProgressPercentage()} + status="exception" + showInfo={false} + /> + <ProgressText>{t('Export cancelled')}</ProgressText> + </ProgressSection> + <ActionButtons> + <CancelButton onClick={onCancel}>{t('Close')}</CancelButton> + {onRetry && ( + <DownloadButton type="primary" onClick={onRetry}> + {t('Retry')} + </DownloadButton> + )} + </ActionButtons> + </ModalContent> + ); + } else if (status === ExportStatus.COMPLETED) { + content = ( + <ModalContent> + <ProgressSection> + <Progress percent={100} status="success" showInfo={false} /> + <ProgressText> + {t('Export successful: 
%s', filename || 'export.csv')} + </ProgressText> + </ProgressSection> + <ActionButtons> + <CancelButton onClick={onCancel}>{t('Close')}</CancelButton> + <DownloadButton + type="primary" + onClick={handleDownload} + disabled={!downloadUrl} + > + {t('Download')} + </DownloadButton> + </ActionButtons> + </ModalContent> + ); + } else { + content = ( + <ModalContent> + <ProgressSection> + <Progress + percent={getProgressPercentage()} + status="active" + showInfo + format={percent => `${Math.round(percent || 0)}%`} + /> + <ProgressText> + {filename + ? t('Processing export for %s', filename) + : t('Processing export...')} + </ProgressText> + </ProgressSection> + <ActionButtons> + <CancelButton onClick={onCancel}>{t('Cancel')}</CancelButton> + <DownloadButton type="primary" disabled> + {t('Download')} + </DownloadButton> + </ActionButtons> + </ModalContent> + ); + } Review Comment: ### Modal content rendering needs componentization <sub></sub> <details> <summary>Tell me more</summary> ###### What is the issue? The modal content rendering logic uses a large if-else block with significant code duplication across different status conditions. ###### Why this matters This approach makes the code harder to maintain and extend. Each status branch contains similar structure with duplicated components, making it difficult to make consistent changes across all states. ###### Suggested change ∙ *Feature Preview* Extract common modal content structure into reusable components and use a status-based mapping: ```typescript const StatusContent = { [ExportStatus.ERROR]: ({ error, onCancel, onRetry }) => ( <ModalContent> <ProgressSection> <Progress percent={0} status="exception" showInfo={false} /> <ErrorText type="danger">{error || t('Export failed')}</ErrorText> </ProgressSection> <ActionButtons {...{ onCancel, onRetry }} /> </ModalContent> ), // ... 
similar for other statuses }; return <Modal>{StatusContent[status](props)}</Modal>; ``` </details> ########## superset-frontend/src/components/StreamingExportModal/useStreamingExport.ts: ########## @@ -0,0 +1,217 @@ +/** … + * under the License.
+ */ +import { useState, useCallback, useRef } from 'react'; +import { ExportStatus, StreamingProgress } from './StreamingExportModal'; + +interface UseStreamingExportOptions { + onComplete?: (downloadUrl: string, filename: string) => void; + onError?: (error: string) => void; +} + +interface StreamingExportParams { + url: string; + payload: any; + filename?: string; + exportType: 'csv' | 'xlsx'; + expectedRows?: number; +} + +export const useStreamingExport = (options: UseStreamingExportOptions = {}) => { + const [progress, setProgress] = useState<StreamingProgress>({ + rowsProcessed: 0, + totalRows: undefined, + totalSize: 0, + speed: 0, + mbPerSecond: 0, + elapsedTime: 0, + status: ExportStatus.STREAMING, + }); + const [isExporting, setIsExporting] = useState(false); + const abortControllerRef = useRef<AbortController | null>(null); + + const updateProgress = useCallback((updates: Partial<StreamingProgress>) => { + setProgress(prev => ({ ...prev, ...updates })); + }, []); + + const startExport = useCallback( + async ({ + url, + payload, + filename, + exportType, + expectedRows, + }: StreamingExportParams) => { + if (isExporting) return; + + setIsExporting(true); + abortControllerRef.current = new AbortController(); + + updateProgress({ + rowsProcessed: 0, + totalRows: expectedRows, + totalSize: 0, + speed: 0, + mbPerSecond: 0, + elapsedTime: 0, + status: ExportStatus.STREAMING, + filename, + }); + + try { + const response = await fetch(url, { + method: 'POST', + headers: { + 'Content-Type': 'application/x-www-form-urlencoded', + }, + body: new URLSearchParams({ + form_data: JSON.stringify(payload), + filename: filename || `export.${exportType}`, + expected_rows: expectedRows?.toString() || '', + }), + signal: abortControllerRef.current.signal, + credentials: 'same-origin', + }); + + if (!response.ok) { + throw new Error( + `Export failed: ${response.status} ${response.statusText}`, + ); Review Comment: ### Large function with multiple responsibilities 
<sub></sub> <details> <summary>Tell me more</summary> ###### What is the issue? The startExport function is too large and handles multiple responsibilities including request setup, streaming, progress tracking, and error handling. ###### Why this matters Large functions with multiple responsibilities are harder to maintain, test, and understand. They violate the Single Responsibility Principle and make the code less modular. ###### Suggested change ∙ *Feature Preview* Extract the streaming logic into separate functions: ```typescript const handleStreamingResponse = async (reader: ReadableStreamDefaultReader, expectedRows: number) => { // Handle streaming logic }; const createDownloadBlob = (data: Uint8Array, exportType: 'csv' | 'xlsx') => { // Handle blob creation }; const startExport = useCallback(async (params: StreamingExportParams) => { const response = await setupExportRequest(params); const streamedData = await handleStreamingResponse(response.body.getReader(), params.expectedRows); const downloadUrl = createDownloadBlob(streamedData, params.exportType); handleExportCompletion(downloadUrl, params.filename); }, []); ``` </details> <sub> 💬 Looking for more details? Reply to this comment to chat with Korbit.
</sub> ########## superset/commands/chart/data/streaming_export_command.py: ########## @@ -0,0 +1,169 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +"""Command for streaming CSV exports of large datasets.""" + +from __future__ import annotations + +import logging +import time +from typing import Callable, Generator, TYPE_CHECKING + +from flask import current_app as app + +from superset.commands.base import BaseCommand + +if TYPE_CHECKING: + from superset.common.query_context import QueryContext + +logger = logging.getLogger(__name__) + + +class StreamingCSVExportCommand(BaseCommand): + """ + Command to execute a streaming CSV export. + + This command handles the business logic for: + - Executing database queries with server-side cursors + - Generating CSV data in chunks + - Managing database connections + - Buffering data for efficient streaming + """ + + def __init__( + self, + query_context: QueryContext, + chunk_size: int = 1000, + ): + """ + Initialize the streaming export command.
+ + Args: + query_context: The query context containing datasource and query details + chunk_size: Number of rows to fetch per database query (default: 1000) + """ + self._query_context = query_context + self._chunk_size = chunk_size + self._current_app = app._get_current_object() + + def validate(self) -> None: + """Validate permissions and query context.""" + self._query_context.raise_for_access() + + def run(self) -> Callable[[], Generator[str, None, None]]: + """ + Execute the streaming CSV export. + + Returns: + A callable that returns a generator yielding CSV data chunks as strings. + The callable is needed to maintain Flask app context during streaming. + """ + + def csv_generator() -> Generator[str, None, None]: + """Generator that yields CSV data from database query.""" + with self._current_app.app_context(): + start_time = time.time() + total_bytes = 0 + + try: + from superset import db + from superset.connectors.sqla.models import SqlaTable + + datasource = self._query_context.datasource + + with db.session() as session: + if isinstance(datasource, SqlaTable): + datasource = session.merge(datasource) + + query_obj = self._query_context.queries[0] + sql_query = datasource.get_query_str(query_obj.to_dict()) + + with datasource.database.get_sqla_engine() as engine: + connection = engine.connect() + + try: + from sqlalchemy import text + + result_proxy = connection.execution_options( + stream_results=True + ).execute(text(sql_query)) + + columns = list(result_proxy.keys()) + + # Yield CSV header + header_row = ( + ",".join(f'"{col}"' for col in columns) + "\n" + ) + total_bytes += len(header_row.encode("utf-8")) + yield header_row + + row_count = 0 + buffer = [] + buffer_size = 0 + flush_threshold = 65536 # 64KB + + while True: + rows = result_proxy.fetchmany(self._chunk_size) + if not rows: + break + + for row in rows: + csv_row = ",".join( + f'"{str(cell) if cell is not None else ""}"' + for cell in row + ) Review Comment: ### Improper CSV quote escaping 
<sub></sub> <details> <summary>Tell me more</summary> ###### What is the issue? CSV escaping is incorrect - quotes within cell values are not properly escaped, breaking CSV format. ###### Why this matters If cell values contain quotes, the generated CSV will be malformed and unparseable by CSV readers, causing data corruption or import failures. ###### Suggested change ∙ *Feature Preview* Use proper CSV escaping by doubling internal quotes: ```python def escape_csv_value(cell): if cell is None: return '""' cell_str = str(cell) # Escape quotes by doubling them escaped = cell_str.replace('"', '""') return f'"{escaped}"' csv_row = ",".join(escape_csv_value(cell) for cell in row) ``` </details> ########## superset-frontend/src/components/StreamingExportModal/StreamingExportModal.tsx: ########## @@ -0,0 +1,215 @@ … + const handleDownload = () => { + if (downloadUrl) { + const link = document.createElement('a'); + link.href = downloadUrl; + link.download = filename || 'export.csv'; Review Comment: ### Hardcoded CSV extension in fallback filename <sub></sub> <details> <summary>Tell me more</summary> ###### What is the issue? The hardcoded fallback filename 'export.csv' doesn't respect the exportType prop, always defaulting to CSV extension regardless of the actual export format. ###### Why this matters This could cause confusion for users when downloading XLSX files that get a .csv extension, potentially leading to file opening issues or user confusion about the actual file format. ###### Suggested change ∙ *Feature Preview* Use the exportType prop to determine the correct file extension: ```typescript const getDefaultFilename = () => { return exportType === 'xlsx' ?
'export.xlsx' : 'export.csv'; }; const handleDownload = () => { if (downloadUrl) { const link = document.createElement('a'); link.href = downloadUrl; link.download = filename || getDefaultFilename(); document.body.appendChild(link); link.click(); document.body.removeChild(link); onCancel(); } }; ``` </details> ########## superset/commands/chart/data/streaming_export_command.py: ########## @@ -0,0 +1,169 @@ … + for row in rows: + csv_row = ",".join( + f'"{str(cell) if cell is not None else ""}"' + for cell in row + ) + csv_line = csv_row + "\n" + row_bytes = len(csv_line.encode("utf-8")) + total_bytes += row_bytes + row_count += 1 + + buffer.append(csv_line) + buffer_size += row_bytes + + if buffer_size >= flush_threshold: + yield "".join(buffer) Review Comment: ### Inefficient buffer flushing with string concatenation <sub></sub> <details> <summary>Tell me more</summary> ###### What is the issue? Using a list buffer with string join for each flush creates O(n) string concatenation overhead when the buffer could be yielded incrementally.
###### Why this matters The join operation creates a new string containing all buffered rows, causing memory spikes and CPU overhead that defeats the purpose of streaming, especially when buffer sizes are large. ###### Suggested change ∙ *Feature Preview* Use a more efficient buffering strategy or yield individual rows: ```python from io import StringIO buffer = StringIO() for row in rows: csv_line = format_csv_row(row) buffer.write(csv_line) if buffer.tell() >= flush_threshold: yield buffer.getvalue() buffer = StringIO() ``` ###### Provide feedback to improve future suggestions [](https://app.korbit.ai/feedback/aa91ff46-6083-4491-9416-b83dd1994b51/8ff8bee7-7507-4a9c-87f5-fb2eb00daa7c/upvote) [](https://app.korbit.ai/feedback/aa91ff46-6083-4491-9416-b83dd1994b51/8ff8bee7-7507-4a9c-87f5-fb2eb00daa7c?what_not_true=true) [](https://app.korbit.ai/feedback/aa91ff46-6083-4491-9416-b83dd1994b51/8ff8bee7-7507-4a9c-87f5-fb2eb00daa7c?what_out_of_scope=true) [](https://app.korbit.ai/feedback/aa91ff46-6083-4491-9416-b83dd1994b51/8ff8bee7-7507-4a9c-87f5-fb2eb00daa7c?what_not_in_standard=true) [](https://app.korbit.ai/feedback/aa91ff46-6083-4491-9416-b83dd1994b51/8ff8bee7-7507-4a9c-87f5-fb2eb00daa7c) </details> <sub> 💬 Looking for more details? Reply to this comment to chat with Korbit. </sub> <!--- korbi internal id:a4214305-a879-43e9-9670-a0ae95487d40 --> [](a4214305-a879-43e9-9670-a0ae95487d40) ########## superset/commands/chart/data/streaming_export_command.py: ########## @@ -0,0 +1,169 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. 
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Command for streaming CSV exports of large datasets."""
+
+from __future__ import annotations
+
+import logging
+import time
+from typing import Callable, Generator, TYPE_CHECKING
+
+from flask import current_app as app
+
+from superset.commands.base import BaseCommand
+
+if TYPE_CHECKING:
+    from superset.common.query_context import QueryContext
+
+logger = logging.getLogger(__name__)
+
+
+class StreamingCSVExportCommand(BaseCommand):
+    """
+    Command to execute a streaming CSV export.
+
+    This command handles the business logic for:
+    - Executing database queries with server-side cursors
+    - Generating CSV data in chunks
+    - Managing database connections
+    - Buffering data for efficient streaming
+    """
+
+    def __init__(
+        self,
+        query_context: QueryContext,
+        chunk_size: int = 1000,
+    ):
+        """
+        Initialize the streaming export command.
+
+        Args:
+            query_context: The query context containing datasource and query details
+            chunk_size: Number of rows to fetch per database query (default: 1000)
+        """
+        self._query_context = query_context
+        self._chunk_size = chunk_size
+        self._current_app = app._get_current_object()
+
+    def validate(self) -> None:
+        """Validate permissions and query context."""
+        self._query_context.raise_for_access()
+
+    def run(self) -> Callable[[], Generator[str, None, None]]:
+        """
+        Execute the streaming CSV export.
+
+        Returns:
+            A callable that returns a generator yielding CSV data chunks as strings.
+            The callable is needed to maintain Flask app context during streaming.
+ """ + + def csv_generator() -> Generator[str, None, None]: + """Generator that yields CSV data from database query.""" + with self._current_app.app_context(): + start_time = time.time() + total_bytes = 0 + + try: + from superset import db + from superset.connectors.sqla.models import SqlaTable + + datasource = self._query_context.datasource + + with db.session() as session: + if isinstance(datasource, SqlaTable): + datasource = session.merge(datasource) + + query_obj = self._query_context.queries[0] + sql_query = datasource.get_query_str(query_obj.to_dict()) + + with datasource.database.get_sqla_engine() as engine: + connection = engine.connect() + + try: + from sqlalchemy import text + + result_proxy = connection.execution_options( + stream_results=True + ).execute(text(sql_query)) + + columns = list(result_proxy.keys()) + + # Yield CSV header + header_row = ( + ",".join(f'"{col}"' for col in columns) + "\n" + ) + total_bytes += len(header_row.encode("utf-8")) + yield header_row + + row_count = 0 + buffer = [] + buffer_size = 0 + flush_threshold = 65536 # 64KB + + while True: + rows = result_proxy.fetchmany(self._chunk_size) + if not rows: + break + + for row in rows: + csv_row = ",".join( + f'"{str(cell) if cell is not None else ""}"' + for cell in row + ) Review Comment: ### Inefficient CSV cell formatting in streaming loop <sub></sub> <details> <summary>Tell me more</summary> ###### What is the issue? String concatenation in a tight loop using join() with generator expression performs unnecessary string conversions and allocations for each cell. ###### Why this matters For large datasets, this creates significant CPU overhead and memory pressure due to repeated str() calls and string object creation for every cell, potentially negating the memory benefits of streaming. 
###### Suggested change ∙ *Feature Preview*
Use a more efficient approach by pre-allocating a list and avoiding repeated str() conversions:

```python
csv_parts = []
for cell in row:
    if cell is None:
        csv_parts.append('""')
    else:
        csv_parts.append(f'"{cell}"')
csv_row = ",".join(csv_parts)
```

</details>

########## superset-frontend/src/components/StreamingExportModal/useStreamingExport.ts: ##########
@@ -0,0 +1,217 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.
+ * See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+import { useState, useCallback, useRef } from 'react';
+import { ExportStatus, StreamingProgress } from './StreamingExportModal';
+
+interface UseStreamingExportOptions {
+  onComplete?: (downloadUrl: string, filename: string) => void;
+  onError?: (error: string) => void;
+}
+
+interface StreamingExportParams {
+  url: string;
+  payload: any;
+  filename?: string;
+  exportType: 'csv' | 'xlsx';
+  expectedRows?: number;
+}
+
+export const useStreamingExport = (options: UseStreamingExportOptions = {}) => {
+  const [progress, setProgress] = useState<StreamingProgress>({
+    rowsProcessed: 0,
+    totalRows: undefined,
+    totalSize: 0,
+    speed: 0,
+    mbPerSecond: 0,
+    elapsedTime: 0,
+    status: ExportStatus.STREAMING,
+  });
+  const [isExporting, setIsExporting] = useState(false);
+  const abortControllerRef = useRef<AbortController | null>(null);
+
+  const updateProgress = useCallback((updates: Partial<StreamingProgress>) => {
+    setProgress(prev => ({ ...prev, ...updates }));
+  }, []);
+
+  const startExport = useCallback(
+    async ({
+      url,
+      payload,
+      filename,
+      exportType,
+      expectedRows,
+    }: StreamingExportParams) => {
+      if (isExporting) return;
+
+      setIsExporting(true);
+      abortControllerRef.current = new AbortController();
+
+      updateProgress({
+        rowsProcessed: 0,
+        totalRows: expectedRows,
+        totalSize: 0,
+        speed: 0,
+        mbPerSecond: 0,
+        elapsedTime: 0,
+        status: ExportStatus.STREAMING,
+        filename,
+      });
+
+      try {
+        const response = await fetch(url, {
+          method: 'POST',
+          headers: {
+            'Content-Type': 'application/x-www-form-urlencoded',
+          },
+          body: new URLSearchParams({
+            form_data: JSON.stringify(payload),
+            filename: filename || `export.${exportType}`,
+            expected_rows: expectedRows?.toString() || '',
+          }),
+          signal: abortControllerRef.current.signal,
+          credentials: 'same-origin',
+        });
+
+        if (!response.ok) {
+          throw new Error(
+            `Export failed: ${response.status}
+              ${response.statusText}`,
+          );
+        }
+
+        if (!response.body) {
+          throw new Error('Response body is not available for streaming');
+        }
+
+        const reader = response.body.getReader();
+        const chunks: Uint8Array[] = [];
+        let receivedLength = 0;
+        let rowsProcessed = 0;
+        const NEWLINE_BYTE = 10; // '\n' character code
+
+        // eslint-disable-next-line no-constant-condition
+        while (true) {
+          // eslint-disable-next-line no-await-in-loop
+          const { done, value } = await reader.read();
+
+          if (done) break;
+
+          if (abortControllerRef.current?.signal.aborted) {
+            throw new Error('Export cancelled by user');
+          }
+
+          chunks.push(value);
+          receivedLength += value.length;
+
+          // Count newlines directly in binary (faster than decoding + regex)
+          let newlineCount = 0;
+          for (let i = 0; i < value.length; i += 1) {
+            if (value[i] === NEWLINE_BYTE) {
+              newlineCount += 1;
+            }
+          }

Review Comment:

### Inefficient newline counting in large chunks

<details>
<summary>Tell me more</summary>

###### What is the issue?
Byte-by-byte iteration to count newlines is inefficient for large chunks and could be optimized using native array methods.

###### Why this matters
This manual loop approach is slower than using optimized native methods, adding unnecessary processing overhead that accumulates significantly with large datasets.

###### Suggested change ∙ *Feature Preview*
Use more efficient methods for counting newlines:

```typescript
// More efficient approach using filter
const newlineCount = value.filter(byte => byte === NEWLINE_BYTE).length;

// Or using reduce for better performance with very large chunks
const newlineCount = value.reduce((count, byte) => byte === NEWLINE_BYTE ?
  count + 1 : count,
  0
);
```

</details>

########## superset-frontend/src/components/StreamingExportModal/useStreamingExport.ts: ##########
@@ -0,0 +1,217 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+import { useState, useCallback, useRef } from 'react';
+import { ExportStatus, StreamingProgress } from './StreamingExportModal';
+
+interface UseStreamingExportOptions {
+  onComplete?: (downloadUrl: string, filename: string) => void;
+  onError?: (error: string) => void;
+}
+
+interface StreamingExportParams {
+  url: string;
+  payload: any;
+  filename?: string;
+  exportType: 'csv' | 'xlsx';
+  expectedRows?: number;
+}
+
+export const useStreamingExport = (options: UseStreamingExportOptions = {}) => {
+  const [progress, setProgress] = useState<StreamingProgress>({
+    rowsProcessed: 0,
+    totalRows: undefined,
+    totalSize: 0,
+    speed: 0,
+    mbPerSecond: 0,
+    elapsedTime: 0,
+    status: ExportStatus.STREAMING,
+  });
+  const [isExporting, setIsExporting] = useState(false);
+  const abortControllerRef = useRef<AbortController | null>(null);
+
+  const updateProgress = useCallback((updates: Partial<StreamingProgress>) => {
+    setProgress(prev => ({ ...prev, ...updates }));
+  }, []);
+
+  const startExport = useCallback(
+    async ({
+      url,
+      payload,
+      filename,
+      exportType,
+      expectedRows,
+    }: StreamingExportParams) => {
+      if (isExporting) return;
+
+      setIsExporting(true);
+      abortControllerRef.current = new AbortController();
+
+      updateProgress({
+        rowsProcessed: 0,
+        totalRows: expectedRows,
+        totalSize: 0,
+        speed: 0,
+        mbPerSecond: 0,
+        elapsedTime: 0,
+        status: ExportStatus.STREAMING,
+        filename,
+      });
+
+      try {
+        const response = await fetch(url, {
+          method: 'POST',
+          headers: {
+            'Content-Type': 'application/x-www-form-urlencoded',
+          },
+          body: new URLSearchParams({
+            form_data: JSON.stringify(payload),
+            filename: filename || `export.${exportType}`,
+            expected_rows: expectedRows?.toString() || '',
+          }),
+          signal: abortControllerRef.current.signal,
+          credentials: 'same-origin',
+        });
+
+        if (!response.ok) {
+          throw new Error(
+            `Export failed: ${response.status} ${response.statusText}`,
+          );
+        }
+
+        if (!response.body) {
+          throw new Error('Response body is not
+            available for streaming');
+        }
+
+        const reader = response.body.getReader();
+        const chunks: Uint8Array[] = [];
+        let receivedLength = 0;
+        let rowsProcessed = 0;
+        const NEWLINE_BYTE = 10; // '\n' character code
+
+        // eslint-disable-next-line no-constant-condition
+        while (true) {
+          // eslint-disable-next-line no-await-in-loop
+          const { done, value } = await reader.read();
+
+          if (done) break;
+
+          if (abortControllerRef.current?.signal.aborted) {
+            throw new Error('Export cancelled by user');
+          }
+
+          chunks.push(value);
+          receivedLength += value.length;
+
+          // Count newlines directly in binary (faster than decoding + regex)
+          let newlineCount = 0;
+          for (let i = 0; i < value.length; i += 1) {
+            if (value[i] === NEWLINE_BYTE) {
+              newlineCount += 1;
+            }
+          }
+          rowsProcessed += newlineCount;

Review Comment:

### Inaccurate CSV row counting with quoted newlines

<details>
<summary>Tell me more</summary>

###### What is the issue?
The row counting logic assumes each newline represents a complete row, but CSV data can contain newlines within quoted fields, leading to inaccurate progress reporting.

###### Why this matters
This will cause the progress indicator to show incorrect row counts when CSV data contains multi-line fields (e.g. text with embedded newlines in quotes), potentially confusing users about export progress and completion status.

###### Suggested change ∙ *Feature Preview*
Implement proper CSV parsing to count actual rows instead of just newlines.
Consider using a streaming CSV parser, or maintain state to track whether we're inside quoted fields:

```typescript
// Track if we're inside a quoted field
let insideQuotes = false;
let newlineCount = 0;
for (let i = 0; i < value.length; i += 1) {
  const byte = value[i];
  if (byte === 34) {
    // '"' character
    insideQuotes = !insideQuotes;
  } else if (byte === NEWLINE_BYTE && !insideQuotes) {
    newlineCount += 1;
  }
}
rowsProcessed += newlineCount;
```

</details>

########## superset-frontend/src/components/StreamingExportModal/useStreamingExport.ts: ##########
@@ -0,0 +1,217 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+import { useState, useCallback, useRef } from 'react';
+import { ExportStatus, StreamingProgress } from './StreamingExportModal';
+
+interface UseStreamingExportOptions {
+  onComplete?: (downloadUrl: string, filename: string) => void;
+  onError?: (error: string) => void;
+}
+
+interface StreamingExportParams {
+  url: string;
+  payload: any;
+  filename?: string;
+  exportType: 'csv' | 'xlsx';
+  expectedRows?: number;
+}

Review Comment:

### Loose typing with 'any'

<details>
<summary>Tell me more</summary>

###### What is the issue?
The payload parameter is typed as 'any', which reduces type safety and makes the interface less predictable.

###### Why this matters
Using 'any' bypasses TypeScript's type checking system, potentially leading to runtime errors that could have been caught during development.
###### Suggested change ∙ *Feature Preview*
Define a specific type for the payload:

```typescript
interface ExportPayload {
  // Define specific payload fields
  [key: string]: string | number | boolean | object;
}

interface StreamingExportParams {
  url: string;
  payload: ExportPayload;
  filename?: string;
  exportType: 'csv' | 'xlsx';
  expectedRows?: number;
}
```

</details>

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
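The quote-aware row counting discussed above can be sketched in a self-contained way (shown in Python for brevity; toy data, hypothetical function name). One refinement over the per-chunk snippet: the quote state must be carried across chunk boundaries, since a quoted field can be split between two network chunks. Doubled quotes (`""`) toggle the state twice, so escaped quotes are handled naturally by this state machine.

```python
QUOTE_BYTE = 0x22    # '"'
NEWLINE_BYTE = 0x0A  # '\n'


def count_rows(chunks, inside_quotes: bool = False) -> int:
    """Count row-terminating newlines across byte chunks, skipping
    newlines that fall inside quoted CSV fields and carrying the
    quote state from one chunk to the next."""
    rows = 0
    for chunk in chunks:
        for byte in chunk:
            if byte == QUOTE_BYTE:
                inside_quotes = not inside_quotes
            elif byte == NEWLINE_BYTE and not inside_quotes:
                rows += 1
    return rows


# The embedded newline inside the quoted field is not counted as a row.
data = b'"a","line one\nline two"\n"b","plain"\n'
assert count_rows([data]) == 2
# Same result even when a quoted field is split across two chunks.
assert count_rows([data[:10], data[10:]]) == 2
```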
