Sorry, I'm a bit of a n00b when it comes to Atrium Integrator.

Are the ARS functions like Left(), etc., available in Spoon? As far as I
can tell, they aren't ... but maybe I'm not looking in the right place?
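(For what it's worth, a rough workaround I've seen suggested: Kettle's "Modified Java Script Value" step scripts in JavaScript, where substr() can stand in for ARS Left(). This is a sketch, not Kettle code as such, and the variable names are made up for illustration:

```javascript
// Hypothetical field value coming out of the LDAP input step.
var rawValue = "Some unusually long value pulled from a dirty LDAP attribute";

// Rough equivalent of ARS Left(rawValue, 48): keep only the first 48 characters.
var leftValue = rawValue.substr(0, 48);
```

In the actual step you'd map leftValue back onto the output field in the step's field grid.)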

It seems like the error is thrown before the transaction is ever sent to
the arserver, though I could be wrong on that.

In any case ... the strangest thing about it is that you get the error if
the field's length is 50, even though the job truncates the value to 48
chars before ever sending it to the arserver.
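(Pure speculation on my part, but one way a 48-character value could still trip a 50-character limit is if the limit is actually being enforced in bytes somewhere, since multibyte UTF-8 characters count once for length but more than once for bytes. A quick Node.js illustration, not Kettle itself:

```javascript
// 48 characters, but every one of them is a two-byte UTF-8 sequence.
var s = "é".repeat(48);

console.log(s.length);                     // character count: 48
console.log(Buffer.byteLength(s, "utf8")); // byte count: 96 -- well over a 50-byte limit
```

If something in the chain is counting bytes, a value that Strings cut considers 48 "long" could still be rejected by a length-50 field.)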

If you change the field's length to 0, it works. But then it never writes
anything longer than 48 chars to the now-unlimited-length field.

Either I'm using the tool wrong ... which I strongly suspect ... or maybe
this is a known bug ... but this seems pretty darn basic for nobody else to
have run into it yet.
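(One way to test the non-printable-garbage theory from my original message would be to scrub control characters in a JavaScript step before the Strings cut. A hedged sketch, with a made-up dirty value for illustration:

```javascript
// Hypothetical dirty LDAP value containing embedded control characters.
var dirty = "John\u0000 Doe\u0007";

// Strip ASCII control characters (0x00-0x1F and 0x7F) before truncating.
var clean = dirty.replace(/[\x00-\x1F\x7F]/g, "");
var truncated = clean.substr(0, 48);
```

If the error goes away with the scrub in place, that would point at invisible characters inflating the value past what Strings cut reports.)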

-Andy
 On Apr 29, 2015 5:07 PM, "LJ LongWing" <lj.longw...@gmail.com> wrote:

>
> Andy,
> You could try setting a file on the field to do a left() to see if that
> let's it through
> On Apr 29, 2015 4:02 PM, "Andrew Hicox" <and...@hicox.com> wrote:
>
>>
>> Hi everyone,
>>
>> I know this isn't strictly ARS, but I thought I'd ask here.
>>
>> AI has me pulling my hair out, and at this rate, I'll be bald by Friday.
>>
>> I have a pretty basic job. It queries an ldap server and writes what it
>> finds into a landing form.
>>
>> Because the data on the ldap server is pretty dirty, I'm using the
>> "Strings cut" node to truncate all the data so it'll fit in my fields.
>>
>> And indeed this seems to work. Until it doesn't work.
>>
>> The log shows that ARERR 306 is encountered because I've tried to set a
>> too-long value on a field, and it gives me the field id.
>>
>> Sure enough, I am truncating the data mapped to that field to the length
>> of the field.
>>
>> I think, "well, maybe the indexing isn't really as advertised," so I
>> truncate the field to 1 less char than the max length of the field. No
>> good. What the hell, I go for 2 less. Still no good.
>>
>> I insert a "write to log" right before the AROutput step just to verify,
>> and yes indeed, there is not one value too long going into the AROutput.
>>
>> Ok, so maybe some workflow on the server is doing it? Nope! Disabled all
>> the workflow, and just to be damn sure, I checked the "skip workflow
>> processing" check box on the AROutput node.
>>
>> Still throws the error.
>>
>> Ok, just to be sure, turned on filter logging. Not a darn thing firing.
>> The arserver does not seem to be throwing the error!
>>
>> Ok. In desperation, I set 0 length on the field in question. Boom, it
>> works.
>>
>> So after the job is done, I check to see what the longest value that got
>> written to that field was.
>>
>> It is exactly as it should be: nothing longer than the max length set on
>> "Strings cut" ... which is to say, two characters less than the previous
>> length of the field.
>>
>> What the hell?
>>
>> Only thing I can think of is maybe there's some kind of garbage
>> non-printable ASCII on the input that throws Kettle's length detection for
>> a loop? If so, I don't really see any kind of charset conversion or
>> anything I could use to filter it.
>>
>> I'm on 8.1.01 ... anyone ever run into anything like this before?
>>
>> -Andy
>>  _ARSlist: "Where the Answers Are" and have been for 20 years_
>
>
