Need some help. We are uploading a data file into a General List. The Code column is supposed to contain unique values, but due to user error we end up with duplicate values in that column, which causes issues in the downstream process.
Is there any way we can check for duplicates during the data file upload? If not, can you suggest an alternative solution?
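One option outside the platform is to validate the file before it is uploaded. Below is a minimal Python sketch of such a pre-upload check; the column name "Code" and the file layout are assumptions based on this thread, so adjust them to match your actual export.

```python
import csv
from collections import Counter

def find_duplicate_codes(rows, column="Code"):
    """Return the values that appear more than once in the given column.

    `rows` is any iterable of dicts (e.g. csv.DictReader over the data
    file); `column` is the column that is supposed to be unique.
    """
    counts = Counter(row[column] for row in rows)
    return sorted(value for value, n in counts.items() if n > 1)

# Hypothetical usage on a CSV export, before uploading it:
# with open("upload.csv", newline="") as f:
#     dupes = find_duplicate_codes(csv.DictReader(f))
#     if dupes:
#         raise ValueError(f"Duplicate codes in file: {dupes}")
```

Running this as a gate in whatever process produces the file would stop the duplicates at the source rather than in the model.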
It's always best to fix data issues at the source, if that's still an option...
Otherwise, follow these steps:
1. Create a text-formatted line item holding the values that are supposed to be unique. Ex: in your case, since the code is supposed to be unique but is repeating, use the Code line item.
2. Create a Boolean-formatted line item called Unique with the formula: ISFIRSTOCCURRENCE(Code, List Name of the module)
The Unique line item will be TRUE only for the first occurrence of each code, so any row where it is FALSE is a duplicate and can be filtered out.
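For anyone unfamiliar with the function, the first-occurrence logic above can be sketched in plain Python (this only illustrates the behavior; it is not how Anaplan evaluates it internally):

```python
def is_first_occurrence(values):
    """Mimic the ISFIRSTOCCURRENCE pattern: TRUE the first time a value
    appears in the list, FALSE for every later repeat of that value."""
    seen = set()
    flags = []
    for v in values:
        flags.append(v not in seen)  # first sighting -> True
        seen.add(v)
    return flags
```

Filtering the list to rows where the flag is TRUE leaves exactly one row per code.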
So @anirudh is correct on the first-occurrence piece, but I am wondering how you are loading your list so that it ends up with multiple "codes" that affect the downstream process. Can you show us the list you load the data to?
Thanks @Kanishq17. As @anirudh mentioned, I used ISFIRSTOCCURRENCE to find the duplicates in that particular column. However, the column may also be blank; in that case ISFIRSTOCCURRENCE flags the blank rows as duplicates, which is not ideal. Is there any way to have ISFIRSTOCCURRENCE run the duplicate check only when the column actually contains data?
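The desired blank-aware behavior can be sketched as a small variant of the first-occurrence logic. How blanks should come out (passed through vs. rejected) depends on how the Boolean is used downstream, so the choice below is an assumption; in the model itself the equivalent idea would be wrapping the formula in a blank check before applying ISFIRSTOCCURRENCE.

```python
def is_first_occurrence_nonblank(values):
    """First-occurrence flag that skips blanks: empty cells are never
    compared for uniqueness. Here blanks return True (pass through, not
    treated as duplicates) -- an assumption; flip to False if blank rows
    should be rejected instead."""
    seen = set()
    flags = []
    for v in values:
        if v is None or v == "":
            flags.append(True)   # blank: excluded from the duplicate check
        else:
            flags.append(v not in seen)
            seen.add(v)
    return flags
```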
Please see the attached screenshot. Opsdb config is the column where we are checking for the duplicates.