Re: limits changed?

2008-11-15 Thread Alex Payne

I've not changed that limit.  Email me off-list with your username and
I'll investigate.

Search API requests are not subject to the 100 request per hour limit, no.

On Fri, Nov 14, 2008 at 09:55, Waitman Gobble [EMAIL PROTECTED] wrote:

 Hi,

 It seems like the follow limit changed. It used to be 2,000 however
 today I can't seem to follow more than 1,580.

 Also, the Search API has a statement about no limits to search - are
 search requests disregarded when determining the 100 GET requests per
 hour?

 Thank you,

 Waitman




-- 
Alex Payne - API Lead, Twitter, Inc.
http://twitter.com/al3x


Re: Not pulling @replies in a search feed

2008-11-15 Thread Alex Payne

Do you mean any @replies or just @replies to a specific username?

On Fri, Nov 14, 2008 at 12:23, drupalot [EMAIL PROTECTED] wrote:

 I've been using advanced search to create feeds from the public
 timeline on specific keywords and so forth. One thing I'd like to do
 is not pull tweets that are @[username] replies, but I'm not sure
 how. Is it possible to append an additional parameter to a feed URL
 to prevent pulling @[username] replies, just as it's possible to
 append a parameter for language, keywords, negative keywords, etc.?






-- 
Alex Payne - API Lead, Twitter, Inc.
http://twitter.com/al3x


Re: statuses/replies.xml and statuses/friends.xml return Not found

2008-11-15 Thread fastest963

Can you provide the account name you are using to log in? It may have
been removed or suspended. If not, Alex or someone will have to take a
look, because it would be a server problem. In the past, however, I
got this message when I tried to use an account that was suspended.


Re: invalid profile_image_url returned in JSON timeline

2008-11-15 Thread fastest963

Did you maybe copy something wrong? The first link uses profile image
ID 64498715 and the second uses 64499571 (note the differing digits at
the end).


Re: Not pulling @replies in a search feed

2008-11-15 Thread fastest963

No, what you would have to do is run a regex or similar check for
@[username] and, if it doesn't match, process the data. There is no
way to omit those results on the API side.
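A minimal sketch in PHP of that client-side filter (the tweet texts here are made up for illustration):

```php
<?php
// Hypothetical tweet texts pulled from a search feed.
$tweets = [
    '@drupalot thanks for the tip',
    'Just published a new post about the Twitter API',
    'Replying to @al3x about rate limits',
];

// Keep only tweets that do NOT start with an @username reply.
// Mid-tweet mentions are kept; only leading "@user ..." is treated
// as a reply, matching how twitter.com classifies @replies.
$notReplies = array_filter($tweets, function ($text) {
    return preg_match('/^@\w+/', $text) !== 1;
});
```

Note that `array_filter` preserves the original keys, so reindex with `array_values` if you need a contiguous array afterwards.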


Re: No OAuth Support just made Techmeme

2008-11-15 Thread TCI

Let me get this out of my head - I will never manage to implement it
and raise my kids at the same time...
The way I would like this to work is for me to generate a key/password
for the application and specify what it can do (can read my followers
but cannot change them, cannot read my email, etc.) - and make it
revocable.
How about overlaying the API with another API? Some trusted entity
could store my Twitter password, do all this work of keys/OAuth/
whatever, and provide the exact same API that Twitter does, so that
interested apps just need to switch to that API to provide the service
and no changes are needed.
Rafa

On Nov 14, 5:35 pm, Dossy Shiobara [EMAIL PROTECTED] wrote:
 Jesse Stay wrote:

  I'm okay with anything - OAuth or not, so long as we're not forced to
  store plain-text credentials.

 I just don't want a user's password.  A proxy (API key token, OAuth
 secret, whatever) is better, even if it doesn't afford any extra actual
 security.

 Users are educated over and over not to give up their password except to
 the site that it belongs to.  Undoing that by encouraging people to give
 up their Twitter password is just so frustrating.

 --
 Dossy Shiobara | [EMAIL PROTECTED] | http://dossy.org/
 Panoptic Computer Network | http://panoptic.com/
   He realized the fastest way to change is to laugh at your own
     folly -- then you can let go and quickly move on. (p. 70)


Re: invalid profile_image_url returned in JSON timeline

2008-11-15 Thread Alex Payne

It's possible that the cache of that URL hasn't expired since it was updated.

On Fri, Nov 14, 2008 at 08:35, Kevin Watters [EMAIL PROTECTED] wrote:

 In my friends_timeline.json for tweet 1005190499 I'm getting a
 profile_image_url value of

 http://s3.amazonaws.com/twitter_production/profile_images/64498715/rollins_narrowweb__300x460_0_normal.jpg

 -- which is a 404.

 The correct, working profile image URL that shows up on twitter.com is

 http://s3.amazonaws.com/twitter_production/profile_images/64499571/rollins_narrowweb__300x460_0_normal.jpg

 Just so you guys are aware :)




-- 
Alex Payne - API Lead, Twitter, Inc.
http://twitter.com/al3x


Re: Helpful PHP Optimizations

2008-11-15 Thread Ed Finkler

Good suggestions. The fastest connection to mysql (if you're using
mysql, of course) will likely be via the mysqli extension API, and
with the mysqlnd driver. PDO is quite lightweight, though, so it
probably isn't a big slowdown point for you. It will use the mysqlnd
driver if you have it installed.

http://php.net/manual/en/book.mysqli.php
http://forge.mysql.com/wiki/PHP_MYSQLND

If you're doing the same inserts with different data multiple times
per script execution, you definitely want to use prepared statements.
If the driver supports it natively, you should get a speed increase
because the db engine won't have to do an analyze/compile/optimize
step on every insert.

http://php.net/manual/en/pdo.prepared-statements.php

Also, if you have a set of inserts that will happen during a single
script execution, you might look at disabling auto-commit and using
transactions + a final commit at the end. This should speed up the
time spent inserting data considerably.
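Both points can be sketched with PDO. SQLite's in-memory driver keeps the example self-contained (the `tweets` table and its columns are illustrative); the same calls work against MySQL by swapping the DSN:

```php
<?php
// In-memory SQLite keeps the example self-contained; swap the DSN for
// 'mysql:host=...;dbname=...' to go through MySQL (and mysqlnd, if installed).
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE tweets (id INTEGER PRIMARY KEY, body TEXT)');

// Prepare once, then reuse the statement for every insert so the
// engine doesn't re-analyze the SQL each time.
$stmt = $pdo->prepare('INSERT INTO tweets (body) VALUES (?)');

// Wrap the batch in one transaction instead of auto-committing
// every insert individually.
$pdo->beginTransaction();
foreach (['first tweet', 'second tweet', 'third tweet'] as $body) {
    $stmt->execute([$body]);
}
$pdo->commit();

$count = (int) $pdo->query('SELECT COUNT(*) FROM tweets')->fetchColumn();
```

If anything throws mid-batch, call `$pdo->rollBack()` so the partial batch is discarded rather than half-committed.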

--
Ed Finkler
http://funkatron.com
AIM: funka7ron
ICQ: 3922133
Skype: funka7ron


On Sat, Nov 15, 2008 at 11:47 AM, fastest963 [EMAIL PROTECTED] wrote:

 PHP Optimizations

 PDO (http://www.php.net/pdo) is a must! Speeds up database
 transactions

 1) Store as much data as possible in the database
 2) Store data as arrays for easy updating and fetching (see serialize)
 3) Always use an indexed field in MySQL
 4) Limit echos/prints

 Most Important: If connecting to pages (ex. API calls) use cURL multi
 connects
 (http://www.ibuildings.com/blog/archives/811-Multithreading-in-PHP-with-CURL.html)
 (use the 2nd example)

 Also try to make use of forking
 (http://immike.net/blog/2007/04/08/fork-php-and-speed-up-your-scripts/)

 MYSQL Optimizations

 1) Use BIGINT UNSIGNED fields for ids or anything that could get
 really large
 2) Remember to run OPTIMIZE TABLE


 Finally, if you are storing a lot of data, do the following:
 take the ID (index), hopefully not a number, and md5 it.
 Then store into a specific table based on the first hex digit of the
 md5. Example:

 $user = 'fastest963';          // username to look up
 $hash = md5($user);            // md5 the username
 $table = $hash[0] . 'users';   // $hash[0] is the first hex digit of the md5
 $sql = "SELECT * FROM `$table` WHERE `username` = '$user'";

 Tables would be created like this: first make the 0users table, then
 copy it to make the tables for 1-9 and a-f:
 0users, 1users, 2users, 3users, 4users, 5users, 6users, 7users,
 8users, 9users, ausers, busers, cusers, dusers, eusers, fusers
 Those would be all the table names, thus splitting your data across
 16 tables.
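 The table-splitting scheme can be wrapped in a small helper (the
 function name here is hypothetical):

 ```php
 <?php
 // Picks one of the 16 shard tables (0users .. fusers) from the
 // first hex digit of md5($user).
 function shard_table(string $user): string
 {
     return md5($user)[0] . 'users';
 }

 // md5('abc') is 900150983cd24fb0..., so 'abc' lands in 9users.
 $table = shard_table('abc');
 ```

 Because md5 is deterministic, the same username always maps to the
 same table, so lookups never have to scan more than one shard.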


 I hope that helps people out when optimizing their scripts. These have
 allowed me to process 2000+ tweets per sec via PHP. Any questions?
 Feel free to comment!

 Thanks,
 James Hartig