UDF output

2008-05-05 Thread Rares Vernica

Hello,

If you use "printf" or some similar function inside a UDF function, 
where does the output goes? Or, how can I make such an output go somewhere?


Thanks,
Ray


--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe: http://lists.mysql.com/[EMAIL PROTECTED]



create cursor from dynamic sql

2008-05-05 Thread Bryan Cantwell
I need to create a cursor using dynamic SQL. Below is a short example of
what I am trying to use in order to get the correct result set... ignore
the rest of the proc; I just need to know how to build the cursor
declaration from @vsql.

Thanks

DELIMITER $$

DROP PROCEDURE IF EXISTS `firescope`.`triggerDownTime` $$
CREATE PROCEDURE `firescope`.`triggerDownTime` (IN TrigID BIGINT(20)
unsigned,IN tRange varchar(10), OUT DownTime BIGINT(20) unsigned)
BEGIN

DECLARE done INT DEFAULT 0;

  DECLARE eventTime, startTime, endTime, defaultStart, defaultEnd,
outage BIGINT(20) unsigned;
  DECLARE val, alreadyStarted int(10);

  set @vsql = CONCAT('select unix_timestamp(DATE_SUB(now(),INTERVAL ',
tRange, ')), unix_timestamp(now()), value, unix_timestamp(time_stamp)
from fs_sl_events where triggerid = ', TrigID, ' AND time_stamp >
DATE_SUB(now(),INTERVAL ', tRange, ')');

  DECLARE cur1 CURSOR FOR vsql;

  DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;

  OPEN cur1;

  REPEAT

FETCH cur1 INTO defaultStart, defaultEnd, val, eventTime;

IF NOT done THEN

   set DownTime = unix_timestamp(DATE_SUB(now(),INTERVAL 30
DAY));

END IF;

  UNTIL done END REPEAT;

  CLOSE cur1;
select DownTime;

END $$

DELIMITER ;
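A note on the question itself: MySQL cannot bind a cursor to a variable, so `DECLARE cur1 CURSOR FOR vsql` will not parse; the usual workaround inside a procedure is PREPARE/EXECUTE into a temporary table, then a cursor over that table. The underlying pattern, build the statement text at runtime and iterate its result set, is sketched below using Python's stdlib sqlite3 as a stand-in for MySQL; the sample row and values are invented for illustration.

```python
# Illustrative only: shows the "build SQL text at runtime, then iterate
# the result set" pattern that DECLARE ... CURSOR FOR @vsql is reaching for.
# Uses SQLite (Python stdlib) as a stand-in for MySQL; data is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fs_sl_events (triggerid INTEGER, value INTEGER,"
             " time_stamp TEXT)")
conn.execute("INSERT INTO fs_sl_events VALUES (42, 1, '2008-05-01 00:00:00')")

trig_id = 42
# Build the statement text dynamically, but bind values with placeholders
# rather than splicing them into the string (this avoids the quoting bugs
# that plain string concatenation into @vsql invites).
vsql = "SELECT value, time_stamp FROM fs_sl_events WHERE triggerid = ?"
rows = list(conn.execute(vsql, (trig_id,)))  # a cursor over dynamic SQL
print(rows)
```

The same separation (statement text built once, values bound at execute time) is what MySQL's PREPARE/EXECUTE gives you server-side.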




RE: Import of a mysqldump file fails

2008-05-05 Thread Rolando Edwards
In this case, the command for the second suggestion is

gzip -d < slavesetup.sql.gz | mysql -u USER --password=PASSWORD




RE: Import of a mysqldump file fails

2008-05-05 Thread Eramo, Mark
Hi Kieran,
   Try the following 2 things...

1) Add this to your my.cnf / my.ini in the [mysqld] section

max_allowed_packet=32M

(you might have to set this value higher based on your existing database).


2) If the import still does not work, try it like this as well.

mysql -u USER --password=PASSWORD < file_to_import

These should help you get past this issue. I use both of these to import a MySQL 
4.0.20 dump into MySQL 5.0.

Mark





Import of a mysqldump file fails

2008-05-05 Thread Kieran Kelleher

Hi,

I have a weird one going on here. I have done this many times before
with MySQL 4.1, but now that I am trying to accomplish the same task with
a MySQL 5.0.x setup, mysql is just not succeeding in importing the
mysqldump file.


I am doing a full mysql dump from a mysql master running 5.0.51a on an
OS X 10.4.x Server PowerPC G4 Xserve. The total size of all databases
is around 17GB. I compress on the fly with gzip to create a 2.4GB
compressed dump using the command below. The largest database is all
InnoDB and the 2nd largest is MyISAM, so this is a mixed InnoDB and
MyISAM server.


$ mysqldump -u root -p --all-databases --extended-insert --add-locks \
    --lock-all-tables --flush-logs --master-data=1 --add-drop-table \
    --create-options --quick --set-charset --disable-keys --quote-names \
    | gzip > slavesetup.sql.gz &


I am then copying this compressed dump to a Mac mini Intel (aka the
slave), which has a clean install of 5.0.51b, and importing into the
mysql server. The import ran for a while, and at some stage I noticed
(by looking at top or "show processlist") that importing had completed.
However, checking the databases, only some of them have been imported
(about 6GB of data); the import has mysteriously stopped in the
middle of importing the largest database. The import command is
straightforward...


$ gunzip < slavesetup.sql.gz | mysql -u root -h localhost -p &

The *.err files inside the data dir have nothing to indicate what the  
problem is.


The error I get in the terminal is:
ERROR 2013 (HY000) at line 570: Lost connection to MySQL server during  
query


Any clues as to what might be going on here and why this might be  
happening? Any alternative or workaround suggestions?


Regards, Kieran
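One way to chase the "at line 570" clue without decompressing the whole 2.4GB dump to disk is to stream it through gzip and pull out that one line; with --extended-insert it is often a single oversized INSERT that exceeds max_allowed_packet. The sketch below is illustrative Python (stdlib only); it runs against a tiny in-memory stand-in rather than the real slavesetup.sql.gz.

```python
# Illustrative: ERROR 2013 reports the failing statement's line number.
# This shows how to stream one line out of a compressed dump without
# decompressing it all; an in-memory stand-in replaces the real file.
import gzip
import io
import itertools

def line_of(stream, n):
    """Return line n (1-based) of an iterable of lines, or None."""
    return next(itertools.islice(stream, n - 1, n), None)

# Stand-in for: open("slavesetup.sql.gz", "rb")
fake_dump = io.BytesIO(gzip.compress(b"line1\nline2\nline3\n"))
with gzip.open(fake_dump, "rt") as f:
    print(line_of(f, 2))  # on a real dump: line_of(f, 570)
```

If that line is a multi-megabyte INSERT, raising max_allowed_packet on the receiving server (as suggested above) is the likely fix.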

ANN: Data Wizard for MySQL 8.4 released

2008-05-05 Thread SQL Maestro Group

Hi!

SQL Maestro Group announces the release of Data Wizard for MySQL 8.4, a 
powerful Windows GUI solution for MySQL data management.


Data Wizard for MySQL provides you with a number of easy-to-use wizards to 
generate PHP and ASP.NET scripts for selected tables, views and queries, 
convert any ADO-compatible database to a MySQL database, export data from 
MySQL tables, views and queries to the most popular formats, and import data 
into tables.


http://www.sqlmaestro.com/products/mysql/datawizard/

New features
============

1. It is now possible to create a system scheduled task, or to convert a task 
created in Data Wizard to a system one. Such tasks can be executed by the 
Windows scheduler service even without user logon.


2. Command line options have been implemented.

3. Each running task is now displayed as a separate icon in the system tray.


4. Connecting to remote MySQL databases via HTTP tunneling is now supported.


5. PHP Generator: many new features; see the full press release for details.

6. Data Pump: custom SQL scripts can now be executed after connecting, 
before importing data, and after pumping.


7. ASP.NET Generator: a custom connection string can now be specified.


There are also other useful improvements. The full press release is available at
http://www.sqlmaestro.com/news/company/5148/

Background information:
---
SQL Maestro Group develops complete database administration and 
management tools for MySQL, Oracle, MS SQL Server, DB2, PostgreSQL, SQLite, 
Firebird and MaxDB, providing the performance, scalability and 
reliability required by today's database applications.


Sincerely yours,
The SQL Maestro Group Team
http://www.sqlmaestro.com





Re: why does left join gives more results?

2008-05-05 Thread Brent Baisley
If you are getting more results, I would guess that your users table
is not a list of unique users, but a list of user logins. If that is
the case, then it's your LEFT JOIN and the count(*) that are causing
you to get more results.

If a user logs in 5 times but only has 1 post, you will get 5 records
for that user out of the LEFT JOIN. That's the way left joins work.
You're then doing a count on all the records, but you really just want
a count of the number of posts.

If all my assumptions are correct, then a quick fix for your query is
to change your count(*) to this:

count(DISTINCT posts.post_id) as counted

That will count the number of unique posts. I don't know what your
unique field name is for the posts table.
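The fan-out described here is easy to reproduce. The sketch below uses Python's stdlib sqlite3 in place of MySQL, with an invented minimal schema: one user duplicated across two login rows and a single post.

```python
# Reproduces the LEFT JOIN fan-out: a user with 2 rows in `users` and
# 1 post yields 2 joined rows, so count(*) = 2 but count(DISTINCT) = 1.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER, login TEXT);
    CREATE TABLE posts (post_id INTEGER, poster_id INTEGER);
    INSERT INTO users VALUES (1, 'pat'), (1, 'pat');  -- duplicate user rows
    INSERT INTO posts VALUES (10, 1);
""")

inflated = conn.execute(
    "SELECT count(*) FROM posts c LEFT JOIN users u ON c.poster_id = u.id"
).fetchone()[0]
fixed = conn.execute(
    "SELECT count(DISTINCT c.post_id) FROM posts c "
    "LEFT JOIN users u ON c.poster_id = u.id"
).fetchone()[0]
print(inflated, fixed)  # join fan-out inflates count(*); DISTINCT fixes it
```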

Brent Baisley
Systems Architect

On Sat, May 3, 2008 at 9:00 PM, Patrick Aljord <[EMAIL PROTECTED]> wrote:
> hey all,
>
>  I have my query that counts posts per user:
>
>  SELECT count(*) as counted, c.user_id FROM posts c  group by c.user_id
>  having counted>1 order by counted DESC LIMIT 20
>
>  I wanted to add user login for each count so I did:
>
>  SELECT count(*) as counted, u.login FROM posts c left join users u on
>  posts.poster_id=u.id  group by c.user_id having counted>1 order by
>  counted DESC LIMIT 20
>
>  but now I get more results.
>
>  Any idea what I'm doing wrong?
>
>  Thanks in advance
>
>  Pat
>
