[PHP-DB] cPanel db Creator script

2007-07-20 Thread Paul Smith
I hope I am doing this right, as this is the first time I've sent a message to a 
mailing list!

I have a PHP script that uses cPanel to create a database.  It requires cURL to 
be installed, and my webhost has confirmed to me that it is.

I have not been able to get the script to work, so I emailed my webhost for 
help.  Their reply was: "You may need to configure your script to use PHP's 
built-in cURL function in order to get this to work instead of the CLI cURL."

Now, I am not very experienced yet, but I am learning fast.  This is the first 
time I've come across cURL, so I am at a complete loss and wondered if someone 
could help me with this.  Here's the script I have.

http://$cpanel_user:[EMAIL PROTECTED]:2082/frontend/$cpanel_skin/sql/adddb.html?db=$db_name'");

    // output result
    echo $ch;
}
else {
    echo "Usage: cpanel_create_db.php?db=$db_name";
}
?>
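
If the missing top of that script just builds the URL above and shells out to 
the curl binary, a rough equivalent using PHP's built-in cURL extension might 
look like this (the host name, credentials and skin below are placeholders, not 
taken from the original script):

<?php
// Sketch: call the cPanel "add database" page with the cURL extension
// instead of the CLI curl binary.  All values here are placeholders.
$cpanel_user = 'username';
$cpanel_pass = 'password';
$cpanel_skin = 'x';               // whatever skin the account uses
$cpanel_host = 'www.example.com';
$db_name     = isset($_GET['db']) ? $_GET['db'] : '';

if ($db_name != '') {
    $url = "http://$cpanel_host:2082/frontend/$cpanel_skin/sql/adddb.html?db="
         . urlencode($db_name);

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_USERPWD, "$cpanel_user:$cpanel_pass"); // HTTP auth
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return page as a string
    $result = curl_exec($ch);

    if ($result === false) {
        echo 'cURL error: ' . curl_error($ch);
    } else {
        echo $result;   // cPanel's response page
    }
    curl_close($ch);
} else {
    echo "Usage: cpanel_create_db.php?db=db_name";
}
?>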

Thanks in advance to anyone who has a clue how to solve this!!!

Paul.


RE: [PHP-DB] Friday losing it: JOINS

2007-07-20 Thread Instruct ICC

From: "Instruct ICC" <[EMAIL PROTECTED]>
After a quick break in the john, I'm thinking perhaps table2 is not holding 
unique "part"s as I was told it would.


Sanity restored.  This is the case.


--
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



[PHP-DB] Friday losing it: JOINS

2007-07-20 Thread Instruct ICC
I have a one-table query which returns fewer rows than a two-table query using 
either an implicit inner join or an explicit left join.


Select some, fields
From table1
Where part like '123%'
/* returns 2225 rows */

Select activePart, some, fields
From table1, table2
Where part like '123%'
and part = activePart
/* implicit inner join returns 2270 rows */

Select activePart, some, fields
From table1 left join table2
On part = activePart
Where part like '123%'
/* explicit left join also returns 2270 rows */

Field part exists only in table1, and activePart exists only in table2 (neither 
column appears in the other table).

I assume I am forgetting something basic this Friday.  Why would I get more 
rows from the two-table queries?  I might expect more rows from an outer join.  
Am I writing an outer join?


After a quick break in the john, I'm thinking perhaps table2 is not holding 
unique "part"s as I was told it would.  But I'll post this for your thoughts 
before I check on that, in case you have other insights.
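
One quick way to check that theory is to count how many times each activePart 
value appears in table2; any value that shows up more than once will multiply 
the matching table1 rows in both joins.  A sketch, using the column and table 
names above:

Select activePart, count(*) as cnt
From table2
Group by activePart
Having cnt > 1
/* any rows here mean duplicate activeParts, which inflate the joined counts */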



--
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP-DB] Slooooow query in MySQL.

2007-07-20 Thread Stut

Chris wrote:

Rob Adams wrote:
I have a query that I run using MySQL that returns about 60,000-plus 
rows. It's been so large that I've just been testing it with a limit 
0, 10000 (ten thousand) on the query.  That used to take about 10 
minutes to run, including processing time in PHP, which spits out XML 
from the query.  I decided to chunk the query down into 1,000-row 
increments, and tried that. The script processed 10,000 rows in 23 
seconds!  I was amazed!  But unfortunately it takes quite a bit longer 
than 6*23 seconds to process the 60,000 rows that way (1,000 at a 
time).  It takes almost 8 minutes.  I can't figure out why it takes so 
long, or how to make it faster.  The data for 60,000 rows is about 
120 MB, so I would prefer not to use a temporary table.  Any other 
suggestions?  This is probably more a DB issue than a PHP issue, but 
I thought I'd try here first.
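
A rough sketch of that chunked approach, with placeholder connection details 
and table/column names:

<?php
// Sketch: fetch the result in 1,000-row chunks using LIMIT offset, count
// and write the XML for each chunk as it arrives.
$link = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mydb', $link);

$chunk = 1000;
for ($offset = 0; ; $offset += $chunk) {
    $result = mysql_query("SELECT some, fields FROM big_table LIMIT $offset, $chunk", $link);
    if (mysql_num_rows($result) == 0) {
        break;                       // no more rows
    }
    while ($row = mysql_fetch_assoc($result)) {
        // ... emit XML for $row ...
    }
    mysql_free_result($result);
}
?>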


Sounds like missing indexes or something.

Use explain: http://dev.mysql.com/doc/refman/4.1/en/explain.html


If that were the case I wouldn't expect limiting the number of rows 
returned to make a difference since the actual query is the same.


Chances are it's purely a data transfer delay. Do a test with the same 
query but only grab one of the fields - something relatively small like an 
integer field - and see if that's significantly quicker. I'm betting it 
will be.


If that is the problem, you need to make sure you're only fetching the fields 
you actually need. You may also want to look into changing the cursor type 
you're using, although I'm not sure if that's possible with MySQL, never mind 
how to do it.
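
One option along those lines on the PHP side is an unbuffered query, which 
streams rows to the script instead of buffering the whole result set in the 
client first.  A sketch, assuming the mysql extension and placeholder 
table/column names:

<?php
// Sketch: stream a large result set row by row instead of buffering it.
// mysql_unbuffered_query() sends the query but does not pre-fetch all rows,
// so memory stays flat while the XML is written out as rows arrive.
$link = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mydb', $link);

$result = mysql_unbuffered_query("SELECT only, needed, fields FROM big_table", $link);
while ($row = mysql_fetch_assoc($result)) {
    // emit XML for this row immediately instead of accumulating it
}
// Note: the same connection can't run another query until every row
// from this result has been fetched.
?>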


-Stut

--
http://stut.net/

--
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP-DB] Slooooow query in MySQL.

2007-07-20 Thread Aleksandar Vojnovic
60k records shouldn't be a problem. Show us the query you're making and 
the table structure.


 OKi98 wrote:

Rob Adams wrote:
I have a query that I run using MySQL that returns about 60,000-plus 
rows. It's been so large that I've just been testing it with a limit 
0, 10000 (ten thousand) on the query.  That used to take about 10 
minutes to run, including processing time in PHP, which spits out XML 
from the query.  I decided to chunk the query down into 1,000-row 
increments, and tried that. The script processed 10,000 rows in 23 
seconds!  I was amazed!  But unfortunately it takes quite a bit 
longer than 6*23 seconds to process the 60,000 rows that way (1,000 at 
a time).  It takes almost 8 minutes.  I can't figure out why it takes 
so long, or how to make it faster.  The data for 60,000 rows is about 
120 MB, so I would prefer not to use a temporary table.  Any other 
suggestions?  This is probably more a DB issue than a PHP issue, but 
I thought I'd try here first.
60k rows is not that much; I have tables with 500k rows and queries run 
smoothly.


Anyway, we cannot help you if you do not post:
1. the output of "show create table"
2. the result of "explain" on the query
3. the query itself

OKi98



--
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP-DB] Slooooow query in MySQL.

2007-07-20 Thread OKi98

Rob Adams wrote:
I have a query that I run using MySQL that returns about 60,000-plus 
rows. It's been so large that I've just been testing it with a limit 
0, 10000 (ten thousand) on the query.  That used to take about 10 
minutes to run, including processing time in PHP, which spits out XML 
from the query.  I decided to chunk the query down into 1,000-row 
increments, and tried that. The script processed 10,000 rows in 23 
seconds!  I was amazed!  But unfortunately it takes quite a bit longer 
than 6*23 seconds to process the 60,000 rows that way (1,000 at a 
time).  It takes almost 8 minutes.  I can't figure out why it takes so 
long, or how to make it faster.  The data for 60,000 rows is about 
120 MB, so I would prefer not to use a temporary table.  Any other 
suggestions?  This is probably more a DB issue than a PHP issue, but 
I thought I'd try here first.
60k rows is not that much; I have tables with 500k rows and queries run 
smoothly.


Anyway, we cannot help you if you do not post:
1. the output of "show create table"
2. the result of "explain" on the query
3. the query itself
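
For example, with a hypothetical table name (paste the real output from the 
mysql client):

show create table my_table;
explain select id, name from my_table where name like 'abc%';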

OKi98

--
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php