Can't scrape past page 2 on an AJAX site

Hi,

I'm trying to scrape roughly 60 pages of results from a site. Each results page contains about 20 links; I click each link to open that member's page, scrape the data, move on to the next results page, and repeat. Because the site paginates with AJAX, the URL never changes, so I'm using an Element Click selector to click "next". However, whenever I try to nest the pagination recursively it fails, and if I un-nest it the scraper only gets as far as page 2 before it quits. How can I get it to recursively click "next" through every page? I feel like I'm so close to the answer, but I can't get it. (For reference, I've put a rough Selenium sketch of the flow I'm after below the sitemap.)

URL: http://a-r-a.org/membership/find-a-member/

Sitemap:
{
  "_id": "automotive_recyclers_association",
  "startUrl": ["http://a-r-a.org/membership/find-a-member/"],
  "selectors": [{
    "id": "links",
    "type": "SelectorLink",
    "selector": "div.upme-field-name a",
    "parentSelectors": ["_root", "pagination"],
    "multiple": true,
    "delay": "10000"
  }, {
    "id": "name",
    "type": "SelectorText",
    "selector": "div.upme-field.upme-first_name div.upme-field-value span",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": "1000"
  }, {
    "id": "company",
    "type": "SelectorText",
    "selector": "div.upme-field.upme-company_name div.upme-field-value span",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": 0
  }, {
    "id": "contact_title",
    "type": "SelectorText",
    "selector": "div.upme-field.upme-contact_title div.upme-field-value span",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": 0
  }, {
    "id": "address_1",
    "type": "SelectorText",
    "selector": "div.upme-field.upme-billing_address_1 div.upme-field-value span",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": 0
  }, {
    "id": "address_2",
    "type": "SelectorText",
    "selector": "div.upme-field.upme-billing_address_2 div.upme-field-value span",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": 0
  }, {
    "id": "city",
    "type": "SelectorText",
    "selector": "div.upme-field.upme-billing_city div.upme-field-value span",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": 0
  }, {
    "id": "state",
    "type": "SelectorText",
    "selector": "div.upme-field.upme-billing_state div.upme-field-value span",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": 0
  }, {
    "id": "zip",
    "type": "SelectorText",
    "selector": "div.upme-field.upme-billing_postcode div.upme-field-value span",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": 0
  }, {
    "id": "country",
    "type": "SelectorText",
    "selector": "div.upme-field.upme-billing_country div.upme-field-value span",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": 0
  }, {
    "id": "email",
    "type": "SelectorText",
    "selector": "div.upme-field.upme-user_email div.upme-field-value span",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": 0
  }, {
    "id": "website",
    "type": "SelectorText",
    "selector": "div.upme-field-value a",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": 0
  }, {
    "id": "phone",
    "type": "SelectorText",
    "selector": "div.upme-field.upme-billing_phone div.upme-field-value span",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": 0
  }, {
    "id": "fax",
    "type": "SelectorText",
    "selector": "div.upme-field.upme-billing_fax div.upme-field-value span",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": 0
  }, {
    "id": "phone_2",
    "type": "SelectorText",
    "selector": "div.upme-field.upme-billing_phone2 div.upme-field-value span",
    "parentSelectors": ["links"],
    "multiple": false,
    "regex": "",
    "delay": 0
  }, {
    "id": "pagination",
    "type": "SelectorElementClick",
    "selector": "div.upme-navi",
    "parentSelectors": ["_root"],
    "multiple": true,
    "delay": "12000",
    "clickElementSelector": ".upme-navi a:contains('next')",
    "clickType": "clickMore",
    "discardInitialElements": false,
    "clickElementUniquenessType": "uniqueText"
  }]
}
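
For reference, here is roughly the flow I'm after, sketched with Selenium in case it helps to see it outside the extension. The CSS selectors are the ones from the sitemap above; the XPath for the "next" button and the two-phase approach (collect every profile link first, then visit them one by one) are only guesses at how the page behaves, so they may need adjusting.

# Rough Selenium sketch of the crawl described above. Selector strings are
# copied from the sitemap; the "next" XPath is a guessed equivalent of the
# sitemap's ".upme-navi a:contains('next')" click selector.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import NoSuchElementException, TimeoutException

START_URL = "http://a-r-a.org/membership/find-a-member/"

driver = webdriver.Chrome()
wait = WebDriverWait(driver, 15)
driver.get(START_URL)

# Phase 1: click "next" through the AJAX list, collecting every profile link.
profile_urls = []
while True:
    wait.until(EC.presence_of_all_elements_located(
        (By.CSS_SELECTOR, "div.upme-field-name a")))
    for a in driver.find_elements(By.CSS_SELECTOR, "div.upme-field-name a"):
        href = a.get_attribute("href")
        if href and href not in profile_urls:
            profile_urls.append(href)

    try:
        next_btn = driver.find_element(
            By.XPATH, "//div[contains(@class,'upme-navi')]//a[contains(., 'next')]")
    except NoSuchElementException:
        break  # no "next" button found, assume the last page was reached

    old_first = driver.find_element(By.CSS_SELECTOR, "div.upme-field-name a")
    next_btn.click()
    try:
        wait.until(EC.staleness_of(old_first))  # wait for the AJAX refresh
    except TimeoutException:
        break  # nothing changed after the click, treat it as the last page

# Phase 2: visit each profile and pull the same fields as the sitemap.
def grab(css):
    """Text of the first element matching css, or '' if it is missing."""
    try:
        return driver.find_element(By.CSS_SELECTOR, css).text.strip()
    except NoSuchElementException:
        return ""

records = []
for url in profile_urls:
    driver.get(url)
    records.append({
        "name": grab("div.upme-field.upme-first_name div.upme-field-value span"),
        "company": grab("div.upme-field.upme-company_name div.upme-field-value span"),
        "email": grab("div.upme-field.upme-user_email div.upme-field-value span"),
        # the remaining fields follow the same pattern as the sitemap selectors
    })

print(len(records), "profiles scraped")
driver.quit()

Collecting all the links first avoids losing the AJAX list's state when navigating into a profile page, which is roughly what the nested pagination in the sitemap is supposed to take care of.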

Hi,

the "Next" button's link is javascript:void(0);

That isn't a real URL, so Web Scraper has nothing it can follow; the pagination is handled entirely by JavaScript on the page.

I don't think you will ever get reliable pagination for this website. Sorry.
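
If you want to confirm this yourself, the short Selenium check below prints what is actually behind the "Next" anchor; the XPath used to locate it is only a guess based on the ".upme-navi a:contains('next')" selector in the sitemap.

# Quick check of what the "Next" anchor points to.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("http://a-r-a.org/membership/find-a-member/")

next_btn = driver.find_element(
    By.XPATH, "//div[contains(@class,'upme-navi')]//a[contains(., 'next')]")

# No real URL behind the button, just a placeholder href and a JS click handler.
print(next_btn.get_attribute("href"))     # reportedly javascript:void(0);
print(next_btn.get_attribute("onclick"))  # may be None if the handler is attached via JS

driver.quit()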