When you don't always have non-null values in both series and you are
interpolating nulls, then you effectively want to average each non-null
value with the value interpolated between the nearest non-null points of
the other series.  Google Charts doesn't give you access to the
interpolated values, so you'll have to do your own interpolation just to
compute the average.
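A rough sketch of that interpolation in plain JavaScript follows; the
function names (interpolateNulls, averageSeries) are just illustrative,
not part of the Charts API, and the end points are left null since there
is nothing to interpolate from:

```javascript
// Fill nulls in a series by linear interpolation between the nearest
// non-null neighbors. Nulls at the ends are left as-is (no extrapolation).
function interpolateNulls(values) {
  const out = values.slice();
  for (let i = 0; i < out.length; i++) {
    if (values[i] !== null) continue;
    // Find the nearest non-null neighbor on each side.
    let lo = i - 1, hi = i + 1;
    while (lo >= 0 && values[lo] === null) lo--;
    while (hi < values.length && values[hi] === null) hi++;
    if (lo < 0 || hi >= values.length) continue; // edge null: skip
    const t = (i - lo) / (hi - lo);
    out[i] = values[lo] + t * (values[hi] - values[lo]);
  }
  return out;
}

// Average two series point by point, after interpolating each one's nulls.
function averageSeries(a, b) {
  const ai = interpolateNulls(a);
  const bi = interpolateNulls(b);
  return ai.map((v, i) =>
    v === null || bi[i] === null ? null : (v + bi[i]) / 2);
}
```

For example, averageSeries([1, null, 3], [2, 2, 2]) interpolates the
first series to [1, 2, 3] and returns [1.5, 2, 2.5].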

To compute your average linear trendline as the average of the separately
computed trendlines, as you asked originally, there is no built-in way to
do that. You would need the equations of the trendlines, which we also
don't provide yet; then you could compute a pair of sample points for
each trendline, average those values, and use the result to construct
your own average trendline.
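Since the Charts API doesn't expose the trendline equations, one option
is to refit them yourself with ordinary least squares and then average
the two lines. A minimal sketch, with linearFit and averageTrendline as
illustrative names (not Charts API calls):

```javascript
// Ordinary least-squares fit of y = slope * x + intercept to one series.
function linearFit(xs, ys) {
  const n = xs.length;
  const mx = xs.reduce((s, x) => s + x, 0) / n;
  const my = ys.reduce((s, y) => s + y, 0) / n;
  let num = 0, den = 0;
  for (let i = 0; i < n; i++) {
    num += (xs[i] - mx) * (ys[i] - my);
    den += (xs[i] - mx) ** 2;
  }
  const slope = num / den;
  return { slope, intercept: my - slope * mx };
}

// Averaging two lines point-wise is the same as averaging their
// coefficients: y = ((m1 + m2) / 2) * x + ((b1 + b2) / 2).
function averageTrendline(fit1, fit2) {
  return {
    slope: (fit1.slope + fit2.slope) / 2,
    intercept: (fit1.intercept + fit2.intercept) / 2,
  };
}
```

The averaged line can then be added to the DataTable as an extra series
and drawn alongside the originals.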

It might be easier to compute the average series, since that only
requires some linear interpolation.  Hope that helps.

On Sun, Jul 26, 2015 at 12:21 PM, Kostas Poulakidas <[email protected]>
wrote:

> Hello Daniel,
>
> thanks for your response,
>
> 1) I want to show both Series and trendline (no reason to hide it)
>
> 2) In my previous example calculating the average is easy But what can I
> do in an example like this? https://jsfiddle.net/gpogxs9b/
>



-- 
Daniel LaLiberte <https://plus.google.com/100631381223468223275?prsrc=2>  -
978-394-1058
[email protected] <[email protected]>   5CC, Cambridge MA
[email protected] <[email protected]> 9 Juniper Ridge
Road, Acton MA

-- 
You received this message because you are subscribed to the Google Groups 
"Google Visualization API" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/google-visualization-api.
For more options, visit https://groups.google.com/d/optout.