Enumerating targets using searches

Searches are jobs that Ocular runs to enumerate targets for pipelines to run over. A search is the execution of a crawler. A search can either be scheduled to run (using a cron schedule expression) or run ad-hoc, by invoking an API endpoint.

In this example, we will implement a simple use-case of scanning all GitHub repositories that are part of the Crash Override GitHub organization, crashappsec.

Step 1. Selecting a Crawler

Ocular comes bundled with a default GitHub crawler, which supports enumerating all GitHub repositories that are part of a list of GitHub organizations, given as the parameter GITHUB_ORGS. The crawler also needs two other parameters: PROFILE, the profile to use for the pipelines that are created, and DOWNLOADER, the downloader to use for those pipelines. (You can see this by fetching the definition of the GitHub crawler from the endpoint GET /api/v1/crawlers/github.) For more information on the default crawlers, see the default integrations manual section.
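For example, the crawler definition (including the parameters it accepts) can be fetched with a curl command like the one below. It assumes the same OCULAR_API_HOST and OCULAR_API_TOKEN environment variables used by the other examples in this guide.

```shell
# Fetch the definition of the bundled GitHub crawler, including the
# parameters it accepts (GITHUB_ORGS, PROFILE, DOWNLOADER)
curl -fsSL "${OCULAR_API_HOST}/api/v1/crawlers/github" \
	-H "Authorization: Bearer ${OCULAR_API_TOKEN}" \
	-H "Accept: application/json"
```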

NOTE: The GitHub crawler supports using an authenticated token by setting the secret crawler-github-token.
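As a sketch of how that secret might be created, assuming Ocular reads secrets from the Kubernetes namespace it runs in, a kubectl command like the following could be used. The key name and the use of kubectl here are assumptions; check your deployment's documentation for how it expects secrets to be provided.

```shell
# Hypothetical example: create the crawler-github-token secret so the GitHub
# crawler can make authenticated requests (higher rate limits, private repos).
# The key name ("token") and kubectl-based approach are assumptions.
kubectl create secret generic crawler-github-token \
	--from-literal=token="${GITHUB_TOKEN}"
```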

Let's now start the search. We will need to supply:

  • The crawler name (in this case github)
  • The parameters

The snippet below shows an example curl command. We set the org to be our crashappsec org, the downloader to be the default git downloader named git, and the profile to the one we created in the quick start guide example.

curl -fsSL "${OCULAR_API_HOST}/api/v1/searches" \
	-X POST \
	-H "Authorization: Bearer ${OCULAR_API_TOKEN}" \
	-H "Accept: application/json" -H "Content-Type: application/json" \
	-d '{ "crawlerName": "github", "parameters": {"GITHUB_ORGS": "crashappsec", "DOWNLOADER": "git", "PROFILE": "example"}}'
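JSON embedded in a single-quoted shell string is easy to get wrong (an unbalanced brace will cause the API to reject the request), so it can help to build the body in a variable and validate it locally before sending. The snippet below uses python3 purely as a local JSON validator; it makes no API calls.

```shell
# Build the request body in a variable, then sanity-check that it is valid
# JSON locally (python3 -m json.tool exits non-zero on malformed input).
payload='{ "crawlerName": "github", "parameters": {"GITHUB_ORGS": "crashappsec", "DOWNLOADER": "git", "PROFILE": "example"}}'
echo "$payload" | python3 -m json.tool > /dev/null && echo "payload is valid JSON"
```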

This will start the search as a Kubernetes job. After some time you will see pipelines begin to be created for each repository in the org. By default the GitHub crawler starts them 2 minutes apart, but this can be customized with the SLEEP_DURATION parameter.
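For instance, the request below passes SLEEP_DURATION alongside the other parameters to spread pipelines 30 seconds apart rather than the default 2 minutes. The "30s" duration format shown here is an assumption; check the crawler definition for the exact format it expects.

```shell
# Hypothetical example: start pipelines 30 seconds apart instead of the
# default 2 minutes. The "30s" value format is an assumption.
curl -fsSL "${OCULAR_API_HOST}/api/v1/searches" \
	-X POST \
	-H "Authorization: Bearer ${OCULAR_API_TOKEN}" \
	-H "Accept: application/json" -H "Content-Type: application/json" \
	-d '{ "crawlerName": "github", "parameters": {"GITHUB_ORGS": "crashappsec", "DOWNLOADER": "git", "PROFILE": "example", "SLEEP_DURATION": "30s"}}'
```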

If we wanted to run this search every night at midnight, we can instead use the endpoint /api/v1/scheduled/searches.

This endpoint accepts an additional schedule string, a cron schedule expression for when to run the search; in this case it should be 0 0 * * *.
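The five fields of a cron expression are minute, hour, day of month, month, and day of week, so 0 0 * * * means minute 0 of hour 0, every day. A quick way to see the breakdown in the shell:

```shell
# Split the cron expression into its five fields:
# minute, hour, day-of-month, month, day-of-week
read -r minute hour dom month dow <<< "0 0 * * *"
echo "minute=$minute hour=$hour"   # -> minute=0 hour=0 (midnight, every day)
```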

curl -fsSL "${OCULAR_API_HOST}/api/v1/scheduled/searches" \
	-X POST \
	-H "Authorization: Bearer ${OCULAR_API_TOKEN}" \
	-H "Accept: application/json" -H "Content-Type: application/json" \
	-d '{ "crawlerName": "github", "schedule": "0 0 * * *", "parameters": {"GITHUB_ORGS": "crashappsec", "DOWNLOADER": "git", "PROFILE": "example"}}'

Now, every night at midnight the search will run the GitHub crawler, with the parameters set above.
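To confirm the schedule was registered, you could list the scheduled searches with a GET against the same path. That a GET handler exists at this endpoint is an assumption based on the POST endpoint above; consult the API reference to confirm.

```shell
# Assumed endpoint: list registered scheduled searches to confirm ours exists.
curl -fsSL "${OCULAR_API_HOST}/api/v1/scheduled/searches" \
	-H "Authorization: Bearer ${OCULAR_API_TOKEN}" \
	-H "Accept: application/json"
```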

Summary

In this guide, you’ve learned how to:

  1. Start a search and enumerate scan targets
  2. Schedule a search