THANKS Peter!
It works!
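For anyone landing on this thread later: the paging scheme Peter describes below (rpp=100 results per page, page=1 through 15, up to ~1500 results) is just URL construction. A minimal Python sketch of that idea; the helper name is my own, and the old search.twitter.com endpoint is taken as-is from the thread:

```python
from urllib.parse import urlencode

def search_page_urls(query, max_id, pages=15, rpp=100):
    """Build the paged search URLs described in the thread (no network calls)."""
    base = "http://search.twitter.com/search.atom"
    urls = []
    for page in range(1, pages + 1):
        # rpp = results per page, page = pagination index
        params = urlencode({"q": query, "max_id": max_id, "rpp": rpp, "page": page})
        urls.append(base + "?" + params)
    return urls

urls = search_page_urls("japan", 1529989226)
print(len(urls))   # 15
print(urls[0])
```

Fetching each URL and collecting the entry titles is then the same loop as in the PHP below.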

On Apr 16, 12:12 pm, Peter Denton <petermden...@gmail.com> wrote:
> and you could bump up the while loop to 15, to get 1500 results, like...
>
> while ($page_num <= 15 )
>
> notice in the search URL rpp=100 (results per page) and page=$page_num
> (pagination), so you can get up to 1500 results
>
> On Wed, Apr 15, 2009 at 8:08 PM, Peter Denton <petermden...@gmail.com> wrote:
>
>
>
> > <?php
> >      $page_num = 1;
> >      $txtString = "";
> >      $statusUpdate = array();
> >      while ($page_num <= 2)
> >      {
> >          // rpp=100 (results per page), page=$page_num (pagination)
> >          $host = "http://search.twitter.com/search.atom?q=japan&max_id=1529989226&rpp=100&page=$page_num";
> >          $result = file_get_contents($host);
> >          $xml = new SimpleXMLElement($result);
> >          foreach ($xml->entry as $entry)
> >          {
> >              $statusUpdate[] = (string) $entry->title;
> >          }
> >          $page_num++;
> >      }
> >      foreach ($statusUpdate as $su)
> >      {
> >          $txtString .= $su . "\n";
> >      }
> >
> >      $myFile = "myTextFile.txt";
> >      $fh = fopen($myFile, 'w') or die("can't open file");
> >      fwrite($fh, $txtString);
> >      fclose($fh);
> > ?>
>
> > On Wed, Apr 15, 2009 at 6:11 PM, Bill <william...@gmail.com> wrote:
>
> >> Hi. Thanks again. I see:
>
> >> <link type="application/atom+xml" rel="next" href="http://search.twitter.com/search.atom?max_id=1529989226&amp;page=2&amp;q=japan"/>
>
> >> but how do I change that and what do I change it to in the url:
>
> >> http://search.twitter.com/search.atom?q=japan
>
> >> is there some part of the url that I can adjust to get more results?
>
> >> Bill
>
> >> On Apr 15, 11:39 pm, Matt Sanford <m...@twitter.com> wrote:
> >> > You could use http://search.twitter.com/search.atom?q=japan and a
> >> > small bit of scripting. Look in those results for a link rel="next"
> >> > for the next page. That will let you page your way back to ~1500 tweets.
>
> >> > Thanks;
> >> >    — Matt Sanford / @mzsanford
>
> >> > On Apr 15, 2009, at 06:14 AM, Bill wrote:
>
> >> > > Hi Thanks very much. Actually though it seems that they only can make
> >> > > a pdf with my own tweets. I was hoping to get just tweets that contain
> >> > > the word 'japan' but thanks anyway.
>
> >> > > On Apr 15, 9:58 pm, Abraham Williams <4bra...@gmail.com> wrote:
> >> > >> Not quite what you are looking for, but http://tweetbook.in/oauth lets
> >> > >> you export to pdf.
>
> >> > >> On Wed, Apr 15, 2009 at 07:14, Bill <william...@gmail.com> wrote:
>
> >> > >>> Hi. Can any suggest the easiest way to get a text file of say 2000
> >> > >>> tweets that contain the word 'Japan' in them?
> >> > >>> Thanks.
>
> >> > >> --
> >> > >> Abraham Williams | http://the.hackerconundrum.com
> >> > >> Hacker | http://abrah.am | http://twitter.com/abraham
> >> > >> Web608 | Community Evangelist | http://web608.org
> >> > >> This email is: [ ] blogable [x] ask first [ ] private.
> >> > >> Sent from Madison, Wisconsin, United States
>
> > --
> > Peter M. Denton
> > www.twibs.com
> > i...@twibs.com
>
> > Twibs makes Top 20 apps on Twitter - http://tinyurl.com/bopu6c
>
> --
> Peter M. Denton
> www.twibs.com
> i...@twibs.com
>
> Twibs makes Top 20 apps on Twitter - http://tinyurl.com/bopu6c
