Introduction

Diggernaut gives you a convenient way to manage your scrapers (we call them diggers): start them manually or automatically on a schedule, download data in the required format, debug configs, monitor data collection from sources, compile scrapers for the platform you need (to run outside of our service, for example, on your own computer), and use a number of other useful functions that we cover in more detail in the other sections of the documentation.

Basically, working with the service looks like this:

  1. Create a new project.
  2. Create a new digger in the project.
  3. Upload a config into the digger or write one in the editor (a minimal config sketch follows this list).
  4. Start the digger in debug mode.
  5. Review the resulting dataset.
  6. Review the log and fix any issues.
  7. Switch the digger to active mode.
  8. Start the digger in active mode.
  9. Download the dataset in the required format.
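
To give you an idea of what step 3 involves, here is a minimal config sketch written in Diggernaut's YAML-based meta-language. The URL, CSS selectors, and field names are placeholders, and the commands shown are only a small subset of the meta-language, which is described in detail in the configuration sections of this documentation.

    ---
    config:
        agent: Firefox                     # user agent sent with requests
    do:
    # fetch the page; the URL is a placeholder
    - walk:
        to: https://www.example.com/catalog
        do:
        # iterate over each product card found on the page
        - find:
            path: div.product
            do:
            # create a data object for the current card
            - object_new: product
            # extract the product name into the object
            - find:
                path: h2.name
                do:
                - parse
                - object_field:
                    object: product
                    field: name
            # commit the object to the dataset
            - object_save:
                name: product

Note how each command nests under a `do` block, so the config mirrors the structure of the pages it scrapes.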

All scraper operations are simple and intuitive.

Let's go ahead and see how it works.