I like this a lot. Right now, I don't think we could deliver it with
decent performance, but I think we'll have the capability to do this
in the future.

In the meantime, you could always write a proxy to do it. That's
worked for a couple of mobile Twitter clients.
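
To make that concrete, here's a rough sketch of what such a proxy could
look like, assuming HTTP Basic auth and the <batch_reply> wrapper format
Josh proposes below. The feed URLs are the existing REST endpoints; the
proxy itself, its port, and the wrapper handling are illustrative only,
and each upstream call would still count against the rate limit.

from http.server import BaseHTTPRequestHandler, HTTPServer
import urllib.request

# The three feeds Josh's example batches (the existing REST endpoints).
FEEDS = {
    "friends_timeline": "http://twitter.com/statuses/friends_timeline.xml",
    "replies":          "http://twitter.com/statuses/replies.xml",
    "direct_messages":  "http://twitter.com/direct_messages.xml",
}

class BatchProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # Pass the caller's Basic auth header straight through to Twitter.
        auth = self.headers.get("Authorization", "")
        parts = ["<batch_reply>"]
        for name, url in FEEDS.items():
            req = urllib.request.Request(url, headers={"Authorization": auth})
            with urllib.request.urlopen(req) as resp:
                body = resp.read().decode("utf-8")
            if body.startswith("<?xml"):
                body = body.split("?>", 1)[1]  # drop the declaration before nesting
            # The upstream root element is left in place for brevity; the proxy
            # only collapses three round trips into one.
            parts.append("<%s>%s</%s>" % (name, body, name))
        parts.append("</batch_reply>")
        payload = "\n".join(parts).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/xml")
        self.end_headers()
        self.wfile.write(payload)

HTTPServer(("", 8080), BatchProxy).serve_forever()

A mobile client would then make a single GET to the proxy instead of
three separate requests to twitter.com.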

On Thu, Mar 12, 2009 at 22:27, atebits <loren.brich...@gmail.com> wrote:
>
> I'd like to second this request.  It'd be more than reasonable for the
> example above to count as 3 marks against the rate limit; the only
> difference would be that there's a single round trip to the Twitter
> servers rather than 3 (which would save a bit of overhead on your end
> too).  This makes a ton of sense for mobile apps.  Network latency is
> huge, but once a connection gets going it's best to transfer as much
> data in one "chunk" as possible.
>
> Not critical, just would be really nice to have :)
>
> Loren
>
> On Mar 12, 6:33 pm, Joshua Perry <j...@6bit.com> wrote:
>> Well, I accidentally sent that message so let me finish here...
>>
>> <batch_reply>
>>    <friends_timeline>
>>        <status>...</status>
>>        <status>...</status>
>>        <status>...</status>
>>        <status>...</status>
>>    </friends_timeline>
>>    <replies>
>>        <status>...</status>
>>        <status>...</status>
>>    </replies>
>>    <direct_messages>
>>        <direct_message>...</direct_message>
>>        <direct_message>...</direct_message>
>>        <direct_message>...</direct_message>
>>    </direct_messages>
>> </batch_reply>
>>
>> This would probably benefit client applications most of all, as it
>> would let us consolidate 3 requests into one and poll more often for
>> the pieces of data that people most want to see in real time.
>>
>> I think this would be a great candidate for fast-track implementation, as
>> it could sit on top of whatever internal code serves the existing feeds,
>> just adding the wrapper before returning the data.  Also, it would live
>> at a new URI, so it would not change the existing interfaces for current
>> applications.
>>
>> Josh
>>
>>
>>
>> Joshua Perry wrote:
>> > And now something perhaps a little more sane and do-able.
>>
>> > It would be very useful to have a batch request API that would allow
>> > requesting multiple datasets simultaneously.
>>
>> > Something like this:
>>
>> >http://twitter.com/batch_request.xml?friend_timeline_since_id=2345&re...
>>
>> > And this would return the data in the same format as the current REST
>> > API for the individual feeds, but in a top-level container:
>>
>> > <batch_reply>
>> >    <friends_timeline>
>> >        <status>...</status>
>> >        <status>...</status>
>> >        <status>...</status>
>> >        <status>...</status>
>> >    </friends_timeline>
>> >    <direct_messages>
>> >        <direct_message>...</direct_message>
>> >        <direct_message>...</direct_message>
>> >        <direct_message>...</direct_message>
>> >    </direct_messages>
>> > </batch_reply>
>
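
For what it's worth, the client side of Josh's proposal would be just as
simple: one request, one response, three datasets. A minimal sketch,
assuming the hypothetical batch_request.xml endpoint and response format
quoted above (authentication omitted for brevity):

import urllib.request
import xml.etree.ElementTree as ET

# One round trip fetches all three feeds; the parameter name is taken
# from Josh's example and is hypothetical.
url = "http://twitter.com/batch_request.xml?friend_timeline_since_id=2345"
with urllib.request.urlopen(url) as resp:
    root = ET.fromstring(resp.read())  # the <batch_reply> element

statuses = root.findall("./friends_timeline/status")
replies = root.findall("./replies/status")
dms = root.findall("./direct_messages/direct_message")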



-- 
Alex Payne - API Lead, Twitter, Inc.
http://twitter.com/al3x
