Element click goes through all pages, then closes window without scraping anything on Wayback Machine results

Hi, I'm trying to scrape all the available links for a site backed up on the Wayback Machine at archive.org. After I type my site into the Wayback Machine search, I go to the far right of the results page and choose the option to list the URLs. The results are then listed 50 per page.

I have link selectors for the listed links and an Element Click selector for the next-page button, with the click type set to "Click More".

What I expect: It goes through all the pages, collects the links from each page as it goes, then scrapes those links.

What happens instead: It goes through all the pages, then promptly closes the window without returning any scraped data. Occasionally, if I run the scrape again, it decides to scrape the last page and returns that data. Once that happens, the scraper keeps starting at the last page and only ever scrapes that page. To get it to scrape from the first page again, I have to clear my data and close and reopen Chrome.

I'm pretty sure I've encountered a bug (possibly two?). I've seen posts here describing similar problems with no replies, so I really hope this one doesn't get ignored.

Web Scraper version: 0.6.4
Chrome version: Version 98.0.4758.82 (Official Build) (64-bit)
OS: Windows 11 Home (64-bit)

Sitemap:
{
  "_id": "waybackScrape",
  "startUrl": ["Wayback Machine/twitter.com/covid19tracking/"],
  "selectors": [
    {
      "id": "BaseLinks",
      "parentSelectors": ["_root"],
      "type": "SelectorLink",
      "selector": ".url a",
      "multiple": true,
      "delay": 0
    },
    {
      "id": "subLink",
      "parentSelectors": ["BaseLinks"],
      "type": "SelectorLink",
      "selector": ".captures-range-info a",
      "multiple": true,
      "delay": 0
    },
    {
      "id": "nextButton",
      "parentSelectors": ["_root"],
      "type": "SelectorElementClick",
      "clickElementSelector": ".next a",
      "clickElementUniquenessType": "uniqueCSSSelector",
      "clickType": "clickMore",
      "delay": 3000,
      "discardInitialElements": "do-not-discard",
      "multiple": true,
      "selector": "td.url"
    }
  ]
}
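One thing I'm unsure about: my link selector is a sibling of the Element Click selector rather than a child of it, so maybe the links clicked into view are never associated with a page of results. Should the sitemap instead nest the link selector under the click selector, something like this sketch (untested on my part, and the `selector` values inside it are my guesses)?

```json
{
  "_id": "waybackScrape",
  "startUrl": ["Wayback Machine/twitter.com/covid19tracking/"],
  "selectors": [
    {
      "id": "nextButton",
      "parentSelectors": ["_root"],
      "type": "SelectorElementClick",
      "clickElementSelector": ".next a",
      "clickElementUniquenessType": "uniqueCSSSelector",
      "clickType": "clickMore",
      "delay": 3000,
      "discardInitialElements": "do-not-discard",
      "multiple": true,
      "selector": "td.url"
    },
    {
      "id": "BaseLinks",
      "parentSelectors": ["nextButton"],
      "type": "SelectorLink",
      "selector": "a",
      "multiple": false,
      "delay": 0
    },
    {
      "id": "subLink",
      "parentSelectors": ["BaseLinks"],
      "type": "SelectorLink",
      "selector": ".captures-range-info a",
      "multiple": true,
      "delay": 0
    }
  ]
}
```

If that's the intended structure, I'd still like to understand why the original sitemap closes the window without scraping anything.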