Duplicate rows in the scrape export

I'm getting duplicate rows in the export and can't figure out why. Sitemap:

{"_id":"ks-100k-1m","startUrl":["https://www.kickstarter.com/discover/advanced?state=live&pledged=3&sort=popularity&seed=2591390"],"selectors":[{"id":"load_more","type":"SelectorElementClick","parentSelectors":["_root"],"selector":"div.black","multiple":true,"delay":"3000","clickElementSelector":"div.load_more","clickType":"clickMore","discardInitialElements":"do-not-discard","clickElementUniquenessType":"uniqueText"},{"id":"project","type":"SelectorText","parentSelectors":["load_more"],"selector":"h3.type-18","multiple":false,"regex":"","delay":""},{"id":"user","type":"SelectorText","parentSelectors":["load_more"],"selector":"div.inline-block a.soft-black","multiple":false,"regex":"","delay":""},{"id":"percent-funded","type":"SelectorText","parentSelectors":["load_more"],"selector":"div.ksr-green-700 div.type-13:nth-of-type(2) span:nth-of-type(1)","multiple":false,"regex":"","delay":""},{"id":"days-left","type":"SelectorText","parentSelectors":["load_more"],"selector":"span.js-num","multiple":false,"regex":"","delay":""},{"id":"category","type":"SelectorText","parentSelectors":["load_more"],"selector":"a.dark-grey-500:nth-of-type(1)","multiple":false,"regex":"","delay":""},{"id":"url","type":"SelectorElementAttribute","parentSelectors":["load_more"],"selector":"a.block","multiple":false,"extractAttribute":"href","delay":""},{"id":"funding","type":"SelectorText","parentSelectors":["load_more"],"selector":"div.type-13:nth-of-type(1) span:nth-of-type(1)","multiple":false,"regex":"","delay":""}]}

Hi,

You could try this sitemap instead and add any fields you need.
Sitemap:
{"_id":"duplicate","startUrl":["https://www.kickstarter.com/discover/advanced?state=live&pledged=3&sort=popularity&seed=2591390"],"selectors":[{"id":"load_more","type":"SelectorElementClick","parentSelectors":["_root"],"selector":"div.black","multiple":true,"delay":"3000","clickElementSelector":"div.load_more","clickType":"clickMore","discardInitialElements":"do-not-discard","clickElementUniquenessType":"uniqueText"},{"id":"project-name","type":"SelectorLink","parentSelectors":["load_more"],"selector":"a.mb3","multiple":false,"delay":0},{"id":"name","type":"SelectorText","parentSelectors":["project-name"],"selector":".grid-col-10 h2, .relative a.hero__link","multiple":false,"regex":"","delay":0},{"id":"creator","type":"SelectorText","parentSelectors":["project-name"],"selector":"div.text-left.type-16","multiple":false,"regex":"","delay":0},{"id":"funded","type":"SelectorText","parentSelectors":["project-name"],"selector":".grid-col-4-md span.ksr-green-500","multiple":false,"regex":"","delay":0},{"id":"days-left","type":"SelectorText","parentSelectors":["project-name"],"selector":".grid-col-4-md span.block.type-16","multiple":false,"regex":"","delay":0},{"id":"category","type":"SelectorText","parentSelectors":["project-name"],"selector":",.hide a:nth-of-type(2) .ml1 span","multiple":false,"regex":"","delay":0}]}

I found a fix that keeps my original sitemap and collects the data much faster than the suggestion above. The duplicates were caused by the element click selector: I changed "discard initial elements" from "never discard" to "discard when click element exists", and the problem was solved.
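As a safety net, any duplicates that still slip through can be stripped from the CSV export after the scrape. A minimal sketch in Python using only the standard library (the file paths are placeholders, not from the original post):

```python
import csv

def dedupe_rows(in_path: str, out_path: str) -> None:
    """Remove exact duplicate rows from a CSV export,
    keeping the first occurrence and preserving row order."""
    seen = set()
    with open(in_path, newline="", encoding="utf-8") as src, \
         open(out_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        for row in csv.reader(src):
            key = tuple(row)  # a full-row key; use a subset of columns
            if key not in seen:  # to dedupe on e.g. the project URL only
                seen.add(key)
                writer.writerow(row)

# Example: dedupe_rows("ks-100k-1m.csv", "ks-100k-1m-clean.csv")
```

If two rows differ only in a volatile field (such as "days-left"), deduplicating on the URL column alone would be more robust than the exact-row key shown here.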