Also,

A very simple way to insert a token into an already-formed JSON document is
with a Replace in String transform.
Form your JSON as desired with a placeholder such as ##TOKEN##, then use the
Replace in String transform to replace ##TOKEN## with the token value.
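Outside of Hop, the same placeholder idea is a one-liner. A minimal Python
sketch (the payload shape, field names, and token value are made up for
illustration):

```python
# Sketch of what the Replace in String step does: build the JSON body with a
# ##TOKEN## marker, then substitute the real token value before sending it.

# Hypothetical request body; only the ##TOKEN## placeholder matters here.
payload_template = '{"authorization": "Bearer ##TOKEN##", "order_id": 4711}'

def insert_token(template: str, token: str) -> str:
    """Replace every ##TOKEN## placeholder with the actual token value."""
    return template.replace("##TOKEN##", token)

# Token string is illustrative, not a real credential.
body = insert_token(payload_template, "eyJhbGciOi...")
print(body)
```

The point is that the JSON is treated as plain text at this stage, so no
second JSON Input transform is needed just to inject the token.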

Over the next few hours I'll probably come up with a few other approaches
I've used over the past 10 years with Kettle and Hop.

I won't post them here; instead, move to GitHub, post a high-level
description of exactly what you want to achieve, and I'll try to help you.


On Fri, 6 Jun 2025 at 09:51, [email protected] <[email protected]> wrote:

> Hey,
>
> I'm doing just that at present.
>
> There are many ways to do this.
>
> For example, you could:
>
>
>    1. Use one pipeline to authenticate, get the response, parse it, and
>    either set it as a variable or pass it to an executor pipeline for
>    further processing.
>    2. Although it's better to have loosely coupled logic, if you want to
>    process everything in the same pipeline you can tick the 'Do not pass
>    field downstream' flag and your original stream will be kept as a field.
>    3. You can absolutely use as many JSON input fields as you want. I
>    actually prefer to parse in several layers, as it gives you a bit more
>    control and you tend to work faster.
>       1. Example 1:
>       Let's assume that you have a Logistics API and an ERP API and you
>       are making them talk to each other. You would call one of the APIs
>       to get a response for an 'order'. Once you have the response, in a
>       first step parse the order details plus the product array. In a
>       second step parse the product details plus the pallet array, and
>       then the pallet array itself. Once you have all your data parsed,
>       you can build the JSON in the format of the other API.
>       2. Example 2:
>       You have a webservice that receives a call to execute a process.
>       In it there is a token, valid for a few seconds, to interact with
>       other systems. You can parse the token while letting the original
>       response through, then parse the variables required for the next
>       process/call. Execute what you need with the next system, then
>       come back to the webservice and provide your output to it.
>
>
> Both examples are actual use cases: one is an industrial process, the
> other a financial reporting solution.
>
> May I suggest you use the GitHub Discussions next, as although things are
> sort of integrated, answers on the email list are not reflected in GitHub,
> which results in a poor way of spreading knowledge.
>
> Diego
>
>
> On Thu, 5 Jun 2025 at 00:07, <[email protected]> wrote:
>
>> Hello,
>>
>>
>>
>> One of the REST services that I need to call needs a bearer token.
>>
>>
>>
>> It seems Apache Hop doesn't support this directly, but a token can be
>> generated via a curl call.
>>
>>
>>
>> I could implement that, but my problem is that once I have created the
>> new bearer token I need a JSON Input transform to read the new
>> AccessToken value.
>>
>> Actually the problem is that I already have a JSON Input transform that
>> reads the business data (input data that I will provide to the REST
>> service), and I also have another JSON Input transform to read the new
>> access token (each row might have a new access token).
>>
>>
>>
>> You can't have two JSON Inputs within the same stream, and I can't simply
>> join both streams because there is no common field.
>>
>>
>>
>> Do you know how I can combine the two streams (the first JSON file,
>> containing multiple rows, with the second JSON file that will be created
>> for each row and will contain only the new access token)?
>>
>>
>>
>> Thank you in advance for your help
>>
>> Michel
>>
>>
>>
>
