dosubot[bot] commented on issue #12066:
URL: https://github.com/apache/superset/issues/12066#issuecomment-2917313780

   The error "keys must be str, int, float, bool or None, not tuple" when using 
multiple "Group by" entries in the Time-Series Table is a known, long-standing 
issue, confirmed in Superset 4.1.1 and earlier versions. The root cause is in 
the backend: when more than one column is used for "group by", the pandas 
pivot operation produces a MultiIndex, so the resulting dictionary keys are 
tuples, which are not valid JSON keys and therefore trigger this error during 
serialization.
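   
   For illustration, here is a minimal, self-contained reproduction of how tuple 
keys arise from a multi-column pivot and break JSON serialization (the DataFrame 
below is made up for the example and is not Superset code):
   
   ```python
   import json
   import pandas as pd

   # Toy query result: a time column, two group-by columns, one metric.
   df = pd.DataFrame({
       "__timestamp": ["2024-01-01", "2024-01-01", "2024-01-02"],
       "country": ["US", "DE", "US"],
       "platform": ["web", "web", "mobile"],
       "metric": [10, 20, 30],
   })

   # Pivoting on two group-by columns yields MultiIndex columns, so the
   # per-row dict keys become tuples such as ("US", "web").
   pt = df.pivot_table(index="__timestamp",
                       columns=["country", "platform"],
                       values="metric")
   data = pt.to_dict(orient="index")

   try:
       json.dumps(data)
   except TypeError as exc:
       print(exc)  # keys must be str, int, float, bool or None, not tuple
   ```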
   
   The relevant code is the TimeTableViz class in superset/viz.py. When 
"groupby" is set, the get_data method pivots the DataFrame and returns 
pt.to_dict(orient="index"). If the pivoted index or columns are multi-level, 
the keys of that dictionary are tuples and the serialization error you see 
follows. The frontend transformProps.ts expects flat keys and does not handle 
tuple keys either. The backend also restricts "group by" charts to a single 
metric, but the error can still occur with multiple group-by columns and a 
single metric, because the pivoted column labels end up as tuples rather than 
strings or ints 
([source](https://github.com/apache/superset/blob/master/superset/viz.py)).
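   
   A simplified paraphrase of that flow (not the actual viz.py source; the 
function name, arguments, and DTTM_ALIAS constant here are illustrative):
   
   ```python
   import pandas as pd

   DTTM_ALIAS = "__timestamp"  # alias used for the time column in this sketch

   def get_data_sketch(df: pd.DataFrame, groupby: list, metric: str) -> dict:
       # Pivot the query result so each group-by combination becomes a column,
       # mirroring what TimeTableViz.get_data is described as doing.
       pt = df.pivot_table(index=DTTM_ALIAS, columns=groupby, values=metric)
       # With one group-by column, pt.columns is a flat Index and the inner
       # keys of to_dict() are plain strings. With two or more, pt.columns is
       # a MultiIndex and the inner keys are tuples, which is the serialization trap.
       return pt.to_dict(orient="index")
   ```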
   
   There is currently no documented fix or workaround in the latest supported 
versions. If you need multiple group-by columns in a Time-Series Table, you will 
have to flatten the resulting MultiIndex in the backend before serialization, or 
use a different visualization that supports this use case. If you are 
comfortable with code, updating the get_data method in TimeTableViz to flatten 
tuple keys to strings before returning the dictionary would address the 
serialization error.
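   
   A minimal sketch of that flattening, assuming you patch get_data yourself; 
the helper name and the ", " separator are arbitrary choices, not existing 
Superset code:
   
   ```python
   import pandas as pd

   def flatten_columns(pt: pd.DataFrame, sep: str = ", ") -> pd.DataFrame:
       # Collapse MultiIndex column labels into single strings so that
       # to_dict(orient="index") produces JSON-serializable keys.
       if isinstance(pt.columns, pd.MultiIndex):
           pt = pt.copy()
           pt.columns = [sep.join(map(str, col)) for col in pt.columns]
       return pt

   # Inside TimeTableViz.get_data, just before returning:
   #     pt = flatten_columns(pt)
   #     return pt.to_dict(orient="index")
   ```
   
   Keep in mind the frontend would then receive the joined strings as column 
names, so check that transformProps.ts renders them the way you expect.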
   
   No recent changes or fixes for this issue were found in the repository or 
documentation. If you need a workaround, consider using a Table chart instead 
of the Time-Series Table, as it handles group-by columns differently and does 
not rely on the same pivot logic 
([source](https://github.com/apache/superset/blob/master/superset-frontend/cypress-base/cypress/e2e/explore/visualizations/table.test.ts)).
   
   
   *To reply, just mention my name [@dosu](https://go.dosu.dev/dosubot).*
   

