This has happened to me as well when importing files (especially with DMT in 7.6.04), but 
most of the time it was due to the database server being in Unicode and importing 
special characters (like accents) which take 2 bytes rather than 1 to be stored in the 
database. "Tête", for example, needs 5 bytes in Unicode. So if you have a 4-char 
character field it wouldn't fit (well, if the field is not of "byte" type in Dev Studio, in 
which case it "just" doubles the char length in the database, if memory serves).
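The char-count vs byte-count mismatch is easy to demonstrate. A minimal Python sketch, assuming the database column limit is counted in bytes (as it is when the length check is not Unicode-aware):

```python
# "Tête" is 4 characters but 5 bytes in UTF-8: "ê" (U+00EA) encodes to 2 bytes.
s = "Tête"
print(len(s))                  # 4 characters
print(len(s.encode("utf-8")))  # 5 bytes

# So truncating by characters does not guarantee a byte limit:
limit = 4
truncated = s[:limit]          # still "Tête": already within 4 chars
assert len(truncated) <= limit
assert len(truncated.encode("utf-8")) > limit  # yet too big for a 4-byte column
```

This is also why truncating one or two characters below the field length may still fail: each accented character silently costs an extra byte.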



> On 30 Apr 2015, at 07:10, Jarl Grøneng <jarl.gron...@gmail.com> wrote:
> 
> 
> I have seen similar, but I have not had time to investigate it.
> 
> Importing approx. 200 records from a CSV file, and 4 of them fail with ARERR 
> 306.
> 
> --
> J
> 
> 2015-04-30 0:02 GMT+02:00 Andrew Hicox <and...@hicox.com>:
> Hi everyone,
> 
> I know this isn't strictly ARS, but I thought I'd ask here.
> 
> AI has me pulling my hair out, and at this rate, I'll be bald by Friday.
> 
> I have a pretty basic job. It queries an ldap server and writes what it finds 
> into a landing form.
> 
> Because the data on the ldap server is pretty dirty, I'm using the "Strings 
> cut" node to truncate all the data so it'll fit in my fields.
> 
> And indeed this seems to work. Until it doesn't work.
> 
> The log shows that ARERR 306 is encountered, because I've tried to set a 
> too-long value on a field, and it gives me the field id.
> 
> Sure enough, I am truncating the data mapped to that field to the length of 
> the field.
> 
> I think, "well maybe the indexing isn't really as advertised", so I truncate 
> the field to 1 less char than the max length of the field. No good. What the 
> hell, I go for 2 less. Still no good.
> 
> I insert a "write to log" right before the AROutput step just to verify, and 
> yes indeed, there is not one value too long going into the AROutput.
> 
> Ok, so maybe some workflow on the server is doing it? Nope! Disabled all the 
> workflow, and just to be damn sure, I checked the "skip workflow processing" 
> check box on the AROutput node.
> 
> still throws the error.
> 
> Ok, just to be sure, turned on filter logging. Not a darn thing firing. The 
> arserver does not seem to be throwing the error!
> 
> Ok. In desperation, I set 0 length on the field in question. Boom, it works.
> 
> So after the job is done, I check to see what the longest value that got 
> written to that field was.
> 
> It is exactly as it should be: nothing longer than the max length set on 
> "Strings cut" ... which is to say, two characters less than the previous 
> length of the field.
> 
> What the hell?
> 
> Only thing I can think of is maybe there's some kind of garbage non-printable 
> ASCII on the input that throws Kettle's length detection for a loop? If so, I 
> don't really see any kind of charset conversion or anything I could use to 
> filter it.
> 
> I'm on 8.1.01 ... anyone ever run into anything like this before?
> 
> -Andy 
> _ARSlist: "Where the Answers Are" and have been for 20 years_


_______________________________________________________________________________
UNSUBSCRIBE or access ARSlist Archives at www.arslist.org
"Where the Answers Are, and have been for 20 years"
