Outlook Connector Power Automate
Manish Solanki  

How to convert a CSV file to an array of JSON objects?

Use Case

Today, we will learn how to convert a csv file to an array of JSON objects without using ‘Apply to each’ for the row-by-row conversion, as that action is not optimal for large csv files.

Scenario & Sample data

We will take an example of an incoming email with a csv file as an attachment. The flow will convert the attached csv file to an array of JSON objects.

The input csv email attachment looks like this:

Index,Customer Id,First Name,Last Name,Company,City
1,DD37Cf93aecA6Dc,Sheryl,Baxter,Rasmussen Group,East Leonard
2,1Ef7b82A4CAAD10,Preston,Lozano,Vega-Gentry,East Jimmychester
3,6F94879bDAfE5a6,Roy,Berry,Murillo-Perry,Isabelborough
4,5Cef8BFA16c5e3c,Linda,Olsen,"Dominguez, Mcmillan and Donovan",Bensonview
5,053d585Ab6b3159,Joanna,Bender,"Martin, Lang and Andrade",West Priscilla
6,2d08FB17EE273F4,Aimee,Downs,Steele Group,Chavezborough
7,EA4d384DfDbBf77,Darren,Peck,"Lester, Woodard and Mitchell",Lake Ana
8,0e04AFde9f225dE,Brett,Mullen,"Sanford, Davenport and Giles",Kimport
9,C2dE4dEEc489ae0,Sheryl,Meyers,Browning-Simon,Robersonstad
10,8C2811a503C7c5a,Michelle,Gallagher,Beck-Hendrix,Elaineberg

There are 6 fields in the header row of the input csv file: Index, Customer Id, First Name, Last Name, Company, City.

Solution

1. Create an automated cloud flow. Select “When a new email arrives (V3)” as the trigger. When you expand the trigger, you can set various parameters. For this example, set both the “Include Attachments” & “Only with Attachments” parameters to “Yes”. Here, I have also applied a filter on the subject, but you can set other parameters as per your need:

2. Next, add an “Apply to each” action & pass ‘Attachments’ from the trigger to iterate over them. Inside that block, add a compose action & rename it to ‘CSV data’; it stores the content of an attachment. To get the content of the attachment, an expression needs to be added. Every expression is added in the expression box as highlighted in the below screenshot:

base64ToString(items('Apply_to_each')?['contentBytes'])
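The trigger returns each attachment’s content as a base64 string, which base64ToString turns back into plain csv text. A minimal Python sketch of that decoding step (the attachment payload below is a made-up two-line sample, not the flow’s actual data):

```python
import base64

# Attachments arrive base64-encoded, like the trigger's contentBytes property
content_bytes = base64.b64encode(b"Index,Customer Id\n1,DD37Cf93aecA6Dc").decode()

# Equivalent of base64ToString(...): decode back to plain csv text
csv_text = base64.b64decode(content_bytes).decode("utf-8")
print(csv_text)
```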

2.1. Now, add another compose action just below the ‘CSV data’ action inside the apply to each block. Rename it to ‘CSV array’. Write an expression inside the ‘CSV array’ compose action to turn the csv data into a flat array of values using the replace and split functions. The chunk function is then used to regroup that flat array into one array per row (including the header row):

chunk(split(replace(outputs('CSV_data'),decodeUriComponent('%0A'),','),','),6)

The key point here is the second parameter '6' of the chunk function. It must match the number of fields in the input csv file; as the input csv file in this example has 6 columns, we have chosen the value '6'. Note that decodeUriComponent('%0A') produces the line-feed character; if the attachment uses Windows-style line endings, decodeUriComponent('%0D%0A') would be needed instead. Also be aware that a plain split on ',' does not handle quoted fields that themselves contain commas (such as "Dominguez, Mcmillan and Donovan" in the sample data), so such rows would need extra handling before this technique applies cleanly.
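The replace/split/chunk pipeline can be sketched in Python to show how the flat list of values becomes a list of fixed-width rows. This is a rough illustration using a simplified 3-column sample with no quoted commas, not the flow itself:

```python
# Simplified csv sample (3 columns, no quoted commas, so a plain split works)
csv_text = ("Index,Customer Id,First Name\n"
            "1,DD37Cf93aecA6Dc,Sheryl\n"
            "2,1Ef7b82A4CAAD10,Preston")

# replace(outputs('CSV_data'), decodeUriComponent('%0A'), ',') then split(',')
flat = csv_text.replace("\n", ",").split(",")

# chunk(..., 3): regroup the flat list into rows of 3 fields each
field_count = 3
rows = [flat[i:i + field_count] for i in range(0, len(flat), field_count)]
print(rows)
```

With the article’s 6-column file, `field_count` would be 6, matching the second parameter of chunk.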

2.2. To extract the header row from the array, add another compose action & rename it to ‘CSV header’. Here also, write an expression for extracting the header row from the array (the first element of the ‘CSV array’ output):

take(outputs('CSV_array'),1)

2.3. Next, add a “Select” action to create the final array of JSON objects. To get only the data rows, an expression needs to be added in the “From” parameter. Similarly, to get each field or column name and its corresponding value, we need to write expressions for each key & value in the “Map” parameter.

Expression used for “From” parameter:

skip(outputs('CSV_array'),1)

Expressions used for each key & value in the “Map” parameter:

Key                             | Value
first(outputs('CSV_header'))[0] | item()?[0]
first(outputs('CSV_header'))[1] | item()?[1]
first(outputs('CSV_header'))[2] | item()?[2]
first(outputs('CSV_header'))[3] | item()?[3]
first(outputs('CSV_header'))[4] | item()?[4]
first(outputs('CSV_header'))[5] | item()?[5]

Using these expressions, we fetch the field or column names from the header row saved in the ‘CSV header’ compose action. The corresponding values are fetched from each data row by index, as each data row is itself an array.
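In Python terms, the take/skip/Select combination amounts to splitting off the header row and zipping it with each data row. A sketch under the same simplified 3-column, no-quoted-commas assumption as before:

```python
# rows as produced by the chunk step: header first, then data rows
rows = [
    ["Index", "Customer Id", "First Name"],
    ["1", "DD37Cf93aecA6Dc", "Sheryl"],
    ["2", "1Ef7b82A4CAAD10", "Preston"],
]

header = rows[0]        # first(take(outputs('CSV_array'), 1)) -> the header row
data_rows = rows[1:]    # skip(outputs('CSV_array'), 1) -> only the data rows

# Select: map each data row to an object keyed by the header fields
json_objects = [dict(zip(header, row)) for row in data_rows]
print(json_objects)
```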

Output

The output of the Select action is the required array of JSON objects. For the first data row of the sample file, it looks like:

[
  {
    "Index": "1",
    "Customer Id": "DD37Cf93aecA6Dc",
    "First Name": "Sheryl",
    "Last Name": "Baxter",
    "Company": "Rasmussen Group",
    "City": "East Leonard"
  },
  ...
]

Conclusion

We have used expressions & the Select action to perform the conversion, which is an effective approach in scenarios where we have a large csv file to convert and wish to avoid adding a loop for the conversion.
