Hi Dilip,
I have two binary logs on the server. I don't know how to find the server uptime.
mysql> show master logs;
+------------------+
| Log_name         |
+------------------+
| localhost-bin.08 |
| localhost-bin.09 |
+------------------+
2 rows in set (0.00 sec)
If I ask for
Hi,
please tell me the server uptime and also the master logs as
show master logs;
in mysql prompt.
Yes, you can restore the data from the binlogs if you have them.
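For the uptime question, a couple of standard commands from the mysql prompt (no special setup assumed):

```sql
SHOW STATUS LIKE 'Uptime';   -- server uptime in seconds since startup
SHOW MASTER LOGS;            -- lists the binary logs the server still has
```

From the shell, `mysqladmin status` prints the same uptime figure.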
Hi Dilip,
it means I lost the data, correct, Dilip? Is there any other way to recover
that data, any binary logs etc.?
regards,
bala
With JDBC using a PreparedStatement and #setInputStream I have not found
"max_allowed_packet > N" to be required. With a max_allowed_packet of
say 16M, I still have been able to insert LOB data as large as 300MB.
R.
-Original Message-
From: Rick James [mailto:[EMAIL PROTECTED]
Sent: Mon
Hi List,
I want to convert strings to proper-case,
where only the 1st char of each word is uppercase,
such as: "This Is An Example."
Any idea how to do this with MySQL 5.0.15?
Thanks, Cor
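MySQL 5.0 has no built-in proper-case function, but since 5.0 supports stored functions, something along these lines should work. This is an untested sketch: the function name `proper_case` and the 255-character limit are my own choices.

```sql
DELIMITER //
CREATE FUNCTION proper_case(s VARCHAR(255)) RETURNS VARCHAR(255)
DETERMINISTIC
BEGIN
  DECLARE i      INT DEFAULT 1;
  DECLARE result VARCHAR(255) DEFAULT LOWER(s);
  DECLARE prev   CHAR(1) DEFAULT ' ';
  -- Walk the string; uppercase any character that follows a space.
  WHILE i <= CHAR_LENGTH(result) DO
    IF prev = ' ' THEN
      SET result = CONCAT(LEFT(result, i - 1),
                          UPPER(SUBSTRING(result, i, 1)),
                          SUBSTRING(result, i + 1));
    END IF;
    SET prev = SUBSTRING(result, i, 1);
    SET i = i + 1;
  END WHILE;
  RETURN result;
END//
DELIMITER ;

SELECT proper_case('this is an example.');  -- 'This Is An Example.'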
I would run this query:
SELECT *
FROM mytable
WHERE LOWER(emailaddress) IN
  (SELECT LOWER(emailaddress)
   FROM mytable
   GROUP BY 1
   HAVING COUNT(emailaddress) > 1)
This would show all duplicate emails, I would use the info this displays
to choose which records to change/keep/dele
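Once you've reviewed them, one common way to delete the duplicates in bulk is a self-join that keeps one row per address. This sketch assumes the table has an integer primary key (the `id` column here is hypothetical) and keeps the lowest id:

```sql
-- Deletes every row that has a duplicate (case-insensitive) address
-- with a smaller id; the smallest-id row per address survives.
DELETE t1
FROM mytable t1
JOIN mytable t2
  ON LOWER(t1.emailaddress) = LOWER(t2.emailaddress)
 AND t1.id > t2.id;
```

Run it inside a transaction or against a backup copy first, since multi-table DELETEs are not easily undone.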
On Sunday, May 07, 2006 6:14 PM, Greg 'groggy' Lehey wrote:
> On Sunday, 7 May 2006 at 9:27:31 -0700, Robert DiFalco wrote:
>> What are people doing for backups on very large MySQL/InnoDB
>> databases? Say for databases greater than 200 GB. Curious about
>> the backup methods, procedures, and f
I just noticed that a key field (emailaddress) in my db is case
sensitive when it should not have been, so now I've got a bunch of
what are effectively duplicate records. I'm having trouble picking
them out so I can manually merge/delete them before changing the
collation on the field to be
Hi Luke,
When mysql is doing a dump, do the updates that happen during the dump
get included in the dump?
I assume you mean 'mysqldump'.
I have a dump that starts at 11pm and goes for 2 hours. If someone
updates data at say 11:45pm, does that update get included in the dump?
When does the
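For InnoDB tables, mysqldump can take a consistent snapshot, so updates made after the dump starts are not included; everything reflects the state at the moment the dump began. A sketch (database and file names are hypothetical):

```shell
# Consistent snapshot for InnoDB tables; does not block writers.
mysqldump --single-transaction mydb > mydb-backup.sql
```

For MyISAM tables there is no snapshot mechanism; you would need `--lock-all-tables` instead, which does block writes for the duration of the dump.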
Hi Johan,
I need to extract some data to a textfile from a big database.
If I try to do like this:
mysql < queryfile.sql > outfile.txt
"outfile.txt" looks something like:
"OrderID", "Quant", "OrdrDate", "code1", "code2"...
10021, 12, 20060412, 23, 95...
10022, 5, 20060412, , 75...
But, I
Related inequalities:
Given a blob of N bytes:
max_allowed_packet > N
innodb_log_file_size > 10 * N (if InnoDB)
And maybe issues with
bulk_insert_buffer_size
innodb_log_buffer_size
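As a my.cnf sketch of those inequalities for, say, blobs up to ~100 MB (the values are illustrative, not recommendations):

```
[mysqld]
max_allowed_packet   = 128M
# Roughly 10 * N; change only with the server stopped and the
# old ib_logfile* files moved out of the way first.
innodb_log_file_size = 1024M
```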
> -Original Message-
> From: Jeremy Cole [mailto:[EMAIL PROTECTED]
> Sent: Monday, May 08, 2006 2:5
Hi Robert,
Anyone know for sure if the memory needed to insert a LOB is a
percentage of the system's available memory or if it is allocated from
the innodb_buffer_pool_size? IOW, how should my configuration settings
be modified to allow the insertion of larger blobs? :)
The majority of the mem
Hello David,
Thanks for the response. I don't know which version yet. I just
started a month ago with the company and am just starting on this
project. I will find out. The site has not been updated in 5 years
though... so the Unidata database must be at least 6 years old.
Brett
At 06:5
Hi Brett,
Which version of Unidata? I doubt very much if the migration toolkit would
assist with this.
You will probably have to re-normalise the data due to the multi-value
aspects of the Unidata/Universe database. This would probably require the
addition of several more tables to cope (dependen
Hi Greg,
Maybe similar features to those of bacula (my current backup software of
choice for my wife's business servers). This is a very comprehensive open
source solution that has many of the features requested below, e.g.
multiple servers, pooling, aging etc. It is a good example of what my
own req
tony yau wrote:
Hi John,
tried your suggestion but I can't get it to work. This is because I don't
know how to set conditions in the following clauses (because there aren't
any)
and Table1.[condition for Changes1]
and Table2.[condition for Changes2]
and Table3.[condition for Changes3]
What va
> -Original Message-
> From: Daniel da Veiga [mailto:[EMAIL PROTECTED]
> Sent: Monday, May 08, 2006 1:55 PM
> To: mysql@lists.mysql.com
> Subject: Re: Backups with MySQL/InnoDB
>
> On 5/8/06, David Hillman <[EMAIL PROTECTED]> wrote:
> > On May 7, 2006, at 11:29 PM, Robert DiFalco wrote:
It sounds like you need to either synchronize access to 'the connection'
(one user at a time), or have a connection per request. In the latter case,
obtaining a connection from a pool of connections makes sense.
Unfortunately, I have only done this with Java -- not with .NET. I would be
surprised
What I meant by implementing connection pooling is: do I need to make any
code changes other than changes in the connection string?
Thanks,
Romy
On 5/8/06, Tim Lucia <[EMAIL PROTECTED]> wrote:
I don't hear "you need to implement connection pooling". Maybe, but I
think
you might still have err
On 5/9/06, Alex <[EMAIL PROTECTED]> wrote:
That's what I actually did now. We have got the "databases start with
usernames + number appended" situation here, so I patched the sql_show.cc
code to only do ACL checks on databases starting with the username.
Still not optimal but cuts down a show databa
That's what I actually did now. We have got the "databases start with
usernames + number appended" situation here, so I patched the sql_show.cc
code to only do ACL checks on databases starting with the username.
Still not optimal, but it cuts down a SHOW DATABASES on a server with 60,000
databases from
romyd misc said:
> Hi Everyone,
>
> I'm developing an application using C# .NET and mysql as database. It's a
> multithreaded application, we open a mysql database connection at the very
> beginning when the application is started and all the database requests
> use
> the same connection. But under
I don't hear "you need to implement connection pooling". Maybe, but I think
you might still have errors under load, as you approach the maximum
connection count in the pool.
Tim
-Original Message-
From: romyd misc [mailto:[EMAIL PROTECTED]
Sent: Monday, May 08, 2006 2:37 PM
To: mysql@l
Hi John,
tried your suggestion but I can't get it to work. This is because I don't
know how to set conditions in the following clauses (because there isn't
any)
> > and Table1.[condition for Changes1]
> > and Table2.[condition for Changes2]
> > and Table3.[condition for Changes3]
the result I've
Has anyone converted from a Unidata db to MySQL? How easy/difficult
is it to do? Does the MySQL Migration Toolkit help with that process?
An old consulting company set up a website with Unidata and Perl... we
want to convert to MySQL...
Thanks!
--
On 5/8/06, David Hillman <[EMAIL PROTECTED]> wrote:
On May 7, 2006, at 11:29 PM, Robert DiFalco wrote:
> Fast, incremental, compressed, and no max-size limitations. Must be
> transaction safe; able to run while transactions are going on without
> including any started after the backup began; the
Hi Everyone,
I'm developing an application using C# .NET and mysql as database. It's a
multithreaded application, we open a mysql database connection at the very
beginning when the application is started and all the database requests use
the same connection. But under stress or when more than one
On May 7, 2006, at 11:29 PM, Robert DiFalco wrote:
Fast, incremental, compressed, and no max-size limitations. Must be
transaction safe; able to run while transactions are going on without
including any started after the backup began; the usual stuff.
Incremental, transaction safe, compresse
Dear Friends,
I have a database with approximately 10 tables, with about 1 lakh (100,000)
records each in 3 of the tables. I need to know what factors the speed of
MySQL depends on:
1) Does the number of records in one table affect the speed of data fetches
from another table in the same database?
2) What's the approximate
> I need the birthdays from yesterday, today and the next 4 or 5
birthdays.
You don't need to manually compute every date component. Try something
like ...
SELECT ...
WHERE DATE_SUB(NOW(),INTERVAL 1 DAY) <= mem.birthday
AND DATE_ADD(NOW(),INTERVAL 5 DAY) >= mem.birthday
ORDER BY mem.bir
Hi,
Where should I increase max_allowed_packet?
I get an error from Windows (yes, I know... it's running on an M$ OS, not
my fault - not my decision).
The result is about 2 - 10 GB of data.
Regards,
/Johan
Dilipkumar wrote:
Hi,
Increase max_allowed_packet (note: its maximum is 1 GB) and then try to import you
Hi George,
To do it incrementally is not really an option, since I have to run it
as a script during a short time-frame every night, and there's simply no
time to process the files.
The outfile is about 2 - 10 GB every time.
The tables have about 100 - 180 columns, and to do a COALESCE would
Hi,
Yes, it can be repaired using myisamchk, run directly against the table's
index file in the data directory. myisamchk works on the files themselves,
so no username/password is needed:
myisamchk --recover datadirectory/tablename.MYI
This checks the table and rebuilds the index file from the data file.
Payne wrote:
hi, I got a table where the .MYI file can't be read. I tried to
run myisamchk but it gives an error about the inde
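A sketch of the usual repair, assuming a default Linux datadir layout (the paths here are hypothetical). Since myisamchk works on the table files directly, make sure the server is stopped, or at least that nothing is writing to the table (e.g. after FLUSH TABLES):

```shell
# Check first, then rebuild the index from the data file if needed.
myisamchk /var/lib/mysql/mydb/mytable.MYI
myisamchk --recover /var/lib/mysql/mydb/mytable.MYI
```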
Johan,
have you thought about doing this incrementally?
ie - 25% at a time x 4
to show something for NULL, you can use the
COALESCE function.
ie - COALESCE(column,'nothing')
--
George Law
VoIP Network Developer
864-678-3161
[EMAIL PROTECTED]
MSN: [EMAIL PROTECTED]
-Original Message-
Hi,
If you have deleted .MYD files, then truncate the table and restore it
from the backup if you have one.
The .MYD file is what contains your precious data.
balaraju mandala wrote:
Dear Community,
I need your help. I accidentally deleted some '.MYD' files. I want to
restore
them, without stopping the
Hi,
Increase max_allowed_packet (note: its maximum is 1 GB) and then try to
import your data.
e.g., at the mysql prompt run the file as:
USE database;
\. /tmp/filename.txt
Johan Lundqvist wrote:
Hi,
I need to extract some data to a textfile from a big database.
If I try to do like this:
mysql < queryfile.
Hi,
I need to extract some data to a textfile from a big database.
If I try to do like this:
mysql < queryfile.sql > outfile.txt
"outfile.txt" looks something like:
"OrderID", "Quant", "OrdrDate", "code1", "code2"...
10021, 12, 20060412, 23, 95...
10022, 5, 20060412, , 75...
But, I never g
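If the goal is a clean delimited file, SELECT ... INTO OUTFILE writes it on the server side without the client's formatting getting in the way. A sketch: the column names are taken from the sample output above, while the table name `orders` and the file path are hypothetical:

```sql
-- The target file must not already exist; it is created on the
-- server host, so the mysqld user needs write access to the directory.
SELECT OrderID, Quant, OrdrDate, code1, code2
INTO OUTFILE '/tmp/outfile.txt'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
FROM orders;
```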
Dear Community,
I need your help. I accidentally deleted some '.MYD' files. I want to restore
them without stopping the running server. How can I do this? I am using
Linux; I tried to create the file using ---> vi tablename.MYD (a blank file),
but it is not accepted by MySQL.
regards,
bala
Hey, I have a problem with getting the current and the upcoming birthdays.
This is my current birthday SQL:
SELECT SQL_CACHE birthday,mem.lname, mem.fname,mem.mem_id FROM members mem
INNER JOIN network net ON (net.mem_id = mem.mem_id AND net.frd_id =1)
WHERE
(( DAYOFYEAR(FROM_UNIXTIME( mem.birthday )) < D
Is there any way to optimize a range query that includes
an ORDER BY with keys from two different tables? I'm running
MySQL 4.1.18 on FreeBSD.
I've been struggling with some queries that are _incredibly_
slow--from 1-5 minutes on slowish but decent hardware. When I
try versions without the ORDER
sheeri kritzer wrote:
If your server has log-warnings set to ON, you can check the error
logs, and use a script to count how many times for each host, in a
row, this happens.
+---------------+-------+
| Variable_name | Value |
+---------------+-------+
| log_warnings  | 1     |
+---------------+-------+
I did not tur
Hi,
I'm aware of the fact that this is a 32 bit system - and I've tried to make
sure that mysqld will not use more than 4 GB. As you can see the
innodb_buffer_pool_size is 2 Gb and the total amount of memory used by the
MyISAM key buffer size and the per-thread variables is less than 2 GB. The
Thanks for the info. I checked the links you suggested, but it doesn't
look like they address my problem. All the discussions write about the
Croatian collation in latin2, but I'm interested in the utf-8 charset. For
utf-8 there's no Croatian collation. There is one in Slovenian
(utf8_slovenian_ci) that
jehova villa martinez wrote:
Hi, as a newbie with some trouble understanding the English language, I
have a question that I don't know how to put into search engines (I don't
know the technical keywords for my particular case). This is why I post here.
This is the whole picture:
I have four programs