It might be simpler and faster to use resource tagging for billing:

https://docs.aws.amazon.com/emr/latest/ManagementGuide/emr-plan-tags-billing.html

That would also cover other resources (e.g. S3).
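
For example, a rough sketch with boto3 (the region, cluster ID, and tag keys/values below are placeholders, not from this thread) that tags an existing EMR cluster; EMR propagates cluster tags to the underlying EC2 instances, so once the tags are activated as cost allocation tags the spend shows up broken down per job in Cost Explorer:

    import boto3

    # Assumes boto3 is already configured with credentials for the account.
    emr = boto3.client("emr", region_name="us-east-1")  # region is an assumption

    # Tag the cluster for one job; the same Tags list can also be passed to
    # run_job_flow when the cluster is created.
    emr.add_tags(
        ResourceId="j-XXXXXXXXXXXXX",  # placeholder cluster ID
        Tags=[
            {"Key": "job-name", "Value": "nightly-etl"},    # hypothetical tag
            {"Key": "cost-center", "Value": "analytics"},   # hypothetical tag
        ],
    )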

> On 12.12.2023 at 04:47, Jack Wells <jd.we...@gmail.com> wrote:
> 
> 
> Hello Spark experts - I’m running Spark jobs in cluster mode using a 
> dedicated cluster for each job. Is there a way to see how much compute time 
> each job takes via Spark APIs, metrics, etc.? In case it makes a difference, 
> I’m using AWS EMR - I’d ultimately like to be able to say this job costs $X 
> since it took Y minutes on Z instance types (assuming all of the nodes are 
> the same instance type), but I figure I'd probably need to get the Z 
> instance type through EMR APIs.
> 
> Thanks!
> Jack
> 
