Working with Concur API to Close Batches and Retrieve Report Files using KingswaySoft Components

28 June 2022
KingswaySoft Team

SAP Concur is a widely used service for travel, expense, and invoice management, and there are many scenarios in which you would need to read data from or write data to Concur as part of a business process automation. Doing so essentially means working with the SAP Concur API through HTTP-based requests and responses, which you need to construct properly within your ETL process. The requests vary based on the action you wish to perform, and the service calls must be built dynamically from the necessary input data. One practical use case is to find and close batches based on processed batch numbers, then download the corresponding report files and store the data in a database system. This involves not only sending a series of requests, but also passing values between them, parsing details out of the responses, reading the downloaded files, and eventually archiving them. In this blog post, we will show you how all of this can be done easily and in an automated way using KingswaySoft components.

For the use case that we are discussing in this blog post, we will be using the following KingswaySoft components (which can be found as a part of our SSIS Productivity Pack):

  • HTTP Connection Manager
  • JSON Source component 
  • Premium Derived Column component
  • HTTP Requester component
  • Premium Flat File Source component
  • Premium ADO.NET Destination component

Along with the following out-of-the-box SSIS components:

  • Conditional Split component
  • Recordset Destination component

Package Design

The overall design looks like the one shown below. The first data flow, GetBatchNumberFromBatchURL, makes a service call to SAP Concur to retrieve all batch numbers from the Concur expense service endpoint. Inside the foreach loop container, the CloseBatchForBatchNumber data flow closes each batch by looping through the batch numbers received from that initial data flow task, and the Get File data flow downloads the report file, which is then parsed and stored in a database by the LoadFileDataToTable data flow. Finally, once the loop ends, a Premium File Transfer Task archives the files to a separate location.

Control Flow design.png

Now let's break down each task to show how they work internally.

Data Flow - GetBatchNumberFromBatchURL

In this data flow, a JSON Source component is used to send a GET request to the expense paymentbatch endpoint of the Concur API. Please note that an HTTP Connection Manager is used to authenticate with the API; you can find more details on how to work with authentication and OAuth2 in another one of our blog posts (link here). The overall design of this data flow looks like the one below:

GetBatchNumberFromBatchURL.png

In the JSON Source component, the HTTP Method is set to GET, and we have provided the relative path for the paymentbatch endpoint.

JSON Source GET batch URL.png
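
If it helps to see what this amounts to behind the scenes, the call is conceptually equivalent to the Python sketch below. The exact list endpoint path, base URL, and token handling are assumptions for illustration only; in the actual package, the HTTP Connection Manager handles authentication for you.

    # Conceptual sketch only - the JSON Source component issues an equivalent GET request.
    import requests

    BASE_URL = "https://www.concursolutions.com/api"   # base URL configured on the HTTP Connection Manager
    ACCESS_TOKEN = "<OAuth2 access token>"              # obtained through the connection manager in SSIS

    response = requests.get(
        BASE_URL + "/expense/paymentbatch/v1.1/batchlist",   # relative path shown here is an assumption
        headers={
            "Authorization": "Bearer " + ACCESS_TOKEN,
            "Accept": "application/json",
        },
    )
    response.raise_for_status()
    batches = response.json()   # entries carry fields such as Batch-Url and BatchTotal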

Please note that if you have more than one page of batches, you can set up the "request again rule" available in the JSON Source component to paginate through the remaining pages as well. We have a blog post that explains pagination in detail, which you can find here. Once the General page is configured, you need to configure the output columns. To do that, import the document design for the response on the "Document Designer" page.

JSON Source GET batch URL - Document Design.png

As you can see, the response contains batch information such as Batch-Url and BatchTotal, which we will use in the downstream processing. The BatchTotal value determines whether a batch is valid, so we only want to keep batches whose total is greater than zero, and to do this we use a Conditional Split component, as shown below.

Filter Batch totals less that zero.png
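
Expressed in code, the split rule is nothing more than the filter below (a minimal sketch, assuming the batch list parsed in the previous step and a numeric BatchTotal field):

    # Keep only batches with a total greater than zero - the equivalent of the Conditional Split rule.
    valid_batches = [b for b in batches if b["BatchTotal"] > 0]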

The output from here is passed to a Premium Derived Column component, in which an expression extracts the BatchNumber from the Batch-Url field in the response.

FetchBatchNumber.png

The following expression is what you would be using in this case:

GetTokenAtPosition( [Batch-Url], "https://www.concursolutions.com:443/api/expense/paymentbatch/v1.1/batch/", 2 )

This expression splits the URL into two parts and returns the second one, which is the batch number appended at the end of the URL. Once you have the batch number, it needs to be saved to an object variable, @[User::Recordset], which we do by using a Recordset Destination component. With this set up, we move on to the next part.
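
The same parsing can be sketched in plain code: splitting the Batch-Url on the fixed prefix leaves the batch number as the token that follows it. The helper name below is purely illustrative.

    # Equivalent of GetTokenAtPosition: split on the known URL prefix and keep what follows.
    PREFIX = "https://www.concursolutions.com:443/api/expense/paymentbatch/v1.1/batch/"

    def batch_number_from_url(batch_url):
        return batch_url.split(PREFIX, 1)[1]   # the second token, i.e. the trailing batch number

    batch_numbers = [batch_number_from_url(b["Batch-Url"]) for b in valid_batches]

In the SSIS package, this list of batch numbers is what ends up in the @[User::Recordset] variable.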

Data Flow - CloseBatchForBatchNumber

Drag and drop a foreach loop container, connect it in series after the previous data flow (GetBatchNumberFromBatchURL), and build the current data flow inside it. The design inside this data flow task looks as shown below.

CloseBatchesBasedOnBatchNumber.png

The JSON Source component is used to send a POST request to close each batch, based on the batch numbers that we received and saved in the previous data flow. The trick here is to pass the batch numbers to the JSON Source one at a time. We do this by having the foreach loop container iterate through the object variable @[User::Recordset] and assign the value of each iteration to a string variable, @[User::BatchNumber]. In the JSON Source, we receive that value by parameterizing the component to accept the @[User::BatchNumber] variable in its relative path, as shown below.

JSON SOurce Close Batches.png
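
For reference, each iteration of the loop effectively issues a request along the lines of the sketch below. The relative path and the absence of a request body are assumptions here; the screenshot above shows the actual configuration used by the component.

    # Conceptual sketch of the close-batch POST sent once per @[User::BatchNumber].
    for batch_number in batch_numbers:
        close_response = requests.post(
            BASE_URL + "/expense/paymentbatch/v1.1/batch/" + batch_number,   # path parameterized per batch
            headers={"Authorization": "Bearer " + ACCESS_TOKEN},
        )
        close_response.raise_for_status()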

Since this is a source component, SSIS requires a destination downstream, so we simply add a Dummy Destination component to complete the data flow. Now we move on to the next part.

Data Flow - Get File

At this point, the Concur API requires an additional POST request, very similar to the close-batch request but with slight modifications, in order to download the report file. To do that, we once again use a JSON Source component inside the same foreach loop (since this needs to happen for each batch number) and receive a file URL in the response, which we then use to download the file. The design looks like the one below.

Get File.png

The JSON Source relative path for this data flow looks like the one below. You will notice that it is similar to the one used for closing a batch, but slightly different, because the API endpoint requires this additional request to return the file URL.

Additional JSON to get file URL.png
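
In code form, this extra call looks roughly like the sketch below. The path suffix and the response field access are illustrative assumptions; the point is simply that the response carries a File-Url value that we download from in the next step.

    # Conceptual sketch of the additional POST that returns the report file URL for a batch.
    file_info = requests.post(
        BASE_URL + "/expense/paymentbatch/v1.1/batch/" + batch_number + "/file",   # suffix is an assumption
        headers={"Authorization": "Bearer " + ACCESS_TOKEN, "Accept": "application/json"},
    )
    file_info.raise_for_status()
    file_url = file_info.json()["File-Url"]   # parsed by the Document Designer in the actual package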

Once the relative path is set up, we configure the Document Designer page, which parses the response.

File URL document Design.png

With that designed, we add a Premium Derived Column component, in which a similar expression parses the specific file URL out of the "File-Url" node returned by the API, since that node contains some additional values.

fetch File URL using Premium Derived Column.png

Now, use the same Premium Derived Column component, or another one, to set the path where the file will be saved. We need this because, with the file URL we have just extracted, we will be sending a file download request to the API using an HTTP Requester component, which requires a download path and a file name. In our example, we create the files with a datetime suffix, so we use the expression below in a new derived column called "FilePath".

"D:\\MyFiles\\Source\\Result_"+DateTimeToString(GetDate(), "yyyy_MM_dd_hh_mm_ss" )+".txt"

Once we have that, we drag and drop an HTTP Requester component and configure it as shown below.

HTTP REquester to download the file.png

As you can see, the Relative Path is set to the FileURL column that we formed in the Derived Column, and the connection manager remains the same. This is where relative paths come in handy: you can reuse the same connection manager with its Base URL while changing the relative path in each component. Since our use case has one more task, which is to read each file and write its records to a database table, we add another derived column to write the FilePath value to a variable, @[User::FilePathVar], using the expression below.

WriteValueToVariable( @[User::FilePathVar], [FilePath] )
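
Putting the last two steps together, the HTTP Requester is effectively doing something along these lines. The folder and timestamped naming follow the FilePath expression above; everything else (the absolute file URL, the token) is an illustrative assumption, since in the package the connection manager's base URL and the relative path form the full address.

    # Conceptual sketch of the file download performed by the HTTP Requester component.
    from datetime import datetime

    file_path = "D:\\MyFiles\\Source\\Result_" + datetime.now().strftime("%Y_%m_%d_%H_%M_%S") + ".txt"

    download = requests.get(file_url, headers={"Authorization": "Bearer " + ACCESS_TOKEN})
    download.raise_for_status()
    with open(file_path, "wb") as f:
        f.write(download.content)
    # file_path is the value written into @[User::FilePathVar] for the next data flow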

We close the data flow design with a Dummy Destination and move on to the next one.

Data Flow - LoadFileDataToTable

Here, we use a Premium Flat File Source component whose Source File Path property is parameterized to accept the variable @[User::FilePathVar].

LoadFileDataToTable.png

The parameterization can be done by clicking the "fx" button, which opens the Property Expression Editor. Note that the variable holds a dummy value at design time so that the component passes validation; at runtime, it picks up the actual value assigned by the expression.

Premium Flat File parameterization.png

A Premium ADO.NET Destination component is used to write the records to your desired database table. Once the above is configured, we move outside the foreach loop container and drag and drop a Premium File Transfer Task, which moves the set of downloaded files to an archive location.
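
As a rough equivalent of this last data flow, the sketch below reads the downloaded file and inserts its rows into a table. SQLite and the three-column layout are stand-ins for whatever database and schema your Premium ADO.NET Destination actually targets.

    # Conceptual sketch of LoadFileDataToTable: read the flat file and write its rows to a table.
    import csv
    import sqlite3   # stand-in for the actual target database of the Premium ADO.NET Destination

    conn = sqlite3.connect("concur_reports.db")
    conn.execute("CREATE TABLE IF NOT EXISTS batch_report (col1 TEXT, col2 TEXT, col3 TEXT)")

    with open(file_path, newline="") as f:
        for row in csv.reader(f):   # the Premium Flat File Source handles delimiters and columns for you
            conn.execute("INSERT INTO batch_report VALUES (?, ?, ?)", (row + ["", "", ""])[:3])

    conn.commit()
    conn.close()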

Note that this task sits outside the foreach loop because we need to wait until the files for all of the fetched batch numbers have been downloaded and written to the database table.

Conclusion

We hope the above use case has served as an example of how to build this kind of logic against the Concur API and other similar APIs. Depending on which endpoint you wish to work with, you can add data flows to or remove them from the loop container and adjust the design to suit your requirements. The sample package used above can be downloaded here, and you can open it to review the component configurations in more detail. Please note that the sample package is provided for reference purposes only, and we suggest you closely review and adapt the design to your needs before putting it into production use. We hope this has helped.
