Is it possible to omit the "web-scraper-order" and "web-scraper-start-url" columns for cleaner CSV output, and keep the original table row order?

Hi! Currently the CSV output automatically includes these 2 columns of book-keeping data:

(a) web-scraper-order, and
(b) web-scraper-start-url

(1) Is there a way to skip these 2 columns in the output? The reason is that I want clean and simple output for clients who run their scraping on their own, with my customized sitemap, on a daily basis.

Here's a sitemap for a table from W3schools.com:

{
  "_id": "test_table",
  "startUrl": ["https://www.w3schools.com/html/html_tables.asp"],
  "selectors": [{
    "id": "table",
    "type": "SelectorTable",
    "selector": "div.w3-example table",
    "parentSelectors": ["_root"],
    "multiple": true,
    "columns": [
      {"header": "Company", "name": "Company", "extract": true},
      {"header": "Contact", "name": "Contact", "extract": true},
      {"header": "Country", "name": "Country", "extract": true}
    ],
    "delay": "1000",
    "tableDataRowSelector": "tr:nth-of-type(n+2)",
    "tableHeaderRowSelector": "tr:nth-of-type(1)"
  }]
}

(2) As a related issue, is there a way to keep the resulting rows in the original order? They currently come out re-ordered.

Original order:
Alfreds Futterkiste
Centro comercial Moctezuma
...

Thanks in advance!

Hi!

If you want your results to be properly ordered, you have to use a CouchDB instance on your machine.

Please refer to the post here:

Thanks for the reply. I am starting to use CouchDB with WebScraper.io, although I'm not yet familiar with Views, e.g.

However, in this particular case, I was looking for a "dummy" solution for a client to routinely scrape a simple stock price page on their own, in their own browser, with output just for Excel. I think it will still be OK to accept the 2 extra columns.
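That said, a small post-processing script can also tidy the export before it reaches the client. This is only a sketch, assuming an export shaped like the W3Schools table above and that "web-scraper-order" values have the form "<timestamp>-<index>" (the function name, sample data, and file handling are my own, not part of Web Scraper itself):

```python
import csv
import io

def clean_export(csv_text):
    """Sort rows back into scrape order, then drop the book-keeping columns."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    # web-scraper-order is assumed to look like "1613730510-2";
    # sorting on its numeric parts recovers the original scrape order.
    rows.sort(key=lambda r: tuple(int(p) for p in r["web-scraper-order"].split("-")))
    keep = [f for f in rows[0].keys()
            if f not in ("web-scraper-order", "web-scraper-start-url")]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=keep, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

# Hypothetical two-row export, deliberately out of order:
sample = (
    "web-scraper-order,web-scraper-start-url,Company,Contact,Country\r\n"
    "1613730510-2,https://www.w3schools.com/html/html_tables.asp,Centro comercial Moctezuma,Francisco Chang,Mexico\r\n"
    "1613730510-1,https://www.w3schools.com/html/html_tables.asp,Alfreds Futterkiste,Maria Anders,Germany\r\n"
)
print(clean_export(sample))
```

In practice you would read the downloaded CSV file instead of the inline sample; the client then only ever sees the data columns.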

I would recommend that everyone set up the CouchDB storage. Not only does it help store larger data sets, you can also access the data from another networked computer, using a URL other than 127.0.0.1:5984, such as http://192.168.1.201:5984/_utils/#/_all_dbs
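A quick way to confirm the instance is reachable from another machine is to hit CouchDB's standard REST API; /_all_dbs returns the list of databases as JSON. Just a connectivity sketch — the LAN address is from my setup, so substitute your own host:

```python
import json
import urllib.request

COUCH = "http://192.168.1.201:5984"  # example LAN address; use your own host

def all_dbs_url(base_url):
    # CouchDB's /_all_dbs endpoint lists every database on the instance.
    return base_url.rstrip("/") + "/_all_dbs"

def list_databases(base_url=COUCH):
    # Fetch and decode the JSON array of database names.
    with urllib.request.urlopen(all_dbs_url(base_url)) as resp:
        return json.load(resp)

# e.g. list_databases() -> a list of database names on the instance
```

If your CouchDB requires authentication, you would also need to add credentials to the request.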

On the CouchDB page above, you can also get a clean overview list if you have several scraping instances.

Cheers!

You can also just highlight the columns in Excel and select hide/delete.