
Get Ahead of Secrets Leaks: A Step-by-Step Guide to Configuring Ocular for Automated GitHub Scanning with Trufflehog

By Bryce Thuilot


Secrets in source code are one of the simplest ways attackers gain a foothold, and one of the easiest things to prevent, if you have the right guardrails in place. Ocular gives security engineers the power to define and automate those guardrails across the entire software lifecycle.

In this walkthrough, we’ll show how to configure Ocular to automatically scan every GitHub repository in your organization for leaked secrets using Trufflehog. You’ll learn how to define a scanning profile, configure a results uploader, and schedule weekly scans, all in just a few YAML files and curl commands.

This guide assumes you already have an Ocular instance running and ready to accept authenticated API requests. If not, head to the Ocular documentation to get started.

Step 1. Define the plan

One of the most basic, but critical, forms of SAST scanning is detecting credentials or sensitive data inside public or shared source code. In this example, we’ll configure Ocular to scan all GitHub repositories in our organization using Trufflehog, one of the most popular tools for credential detection. The scan will run weekly, and the results will be uploaded to an external service for processing.

The goal: create a recurring pipeline that detects secrets, processes the output, and sends it downstream, all automatically.

In the following steps, you’ll configure the necessary uploader and scanning profile, and wire it all together with a scheduled search. This setup gives your organization a lightweight but effective defense against exposed secrets in source control.

Note: In the code snippets that follow, be sure to replace $OCULAR_API_HOST with your Ocular instance host, and $OCULAR_API_TOKEN with an authenticated token. Refer to the authentication docs for details.
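For example, you might export both values in your shell before running the commands that follow (the host below is a placeholder; use your own instance's address):

export OCULAR_API_HOST="https://ocular.example.com"  # placeholder; your Ocular instance host
export OCULAR_API_TOKEN="<your-api-token>"           # an authenticated token (see the auth docs)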

Step 2. Configure the uploader

To process scan results, Ocular uses an “uploader”, a container that handles post-scan actions like invoking APIs, writing to databases, or triggering downstream workflows. In this example, we’ll assume you have an AWS Lambda function called Trufflehog-Processor that accepts JSON Lines output from Trufflehog.

To integrate this Lambda with Ocular, we’ll define an uploader that runs inside your Kubernetes cluster and uses the AWS CLI to invoke the function with each scan artifact.

First, grant the uploader container permission to invoke the Lambda. We’ll assume your cluster and Lambda reside in the same AWS account and that a Kubernetes service account already exists, tied to an IAM role with the necessary permissions. During Helm installation, you can pass the service account name to Ocular using the api.runtime.uploadersServiceAccount value.
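If you install Ocular with Helm, passing that value might look like the following minimal sketch. The release name, chart reference, and service account name are placeholders for your environment; only the api.runtime.uploadersServiceAccount key comes from the setup described above:

helm upgrade --install ocular <ocular-chart> \
    --set api.runtime.uploadersServiceAccount=<lambda-invoker-sa>  # service account bound to the IAM role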

Next, define the uploader itself in YAML. This configuration is a simplified Kubernetes container spec that tells Ocular how to invoke the Lambda. The args section uses a Bash loop to iterate through all scan artifact file paths (passed as CLI arguments prefixed by --) and invoke the Lambda with each one. We also define a required parameter, LAMBDA_NAME, which is injected into the container as an environment variable prefixed with OCULAR_PARAM_.

Save this definition as uploader.yaml:

# uploader.yaml
image: amazon/aws-cli:latest # image to use
imagePullPolicy: IfNotPresent # pull policy for image
command: ['/bin/bash', '-c'] # entrypoint of container
args: # arguments fed to the container
  - |
    # Ocular appends the artifact file paths to 'args'.
    # With 'bash -c', the first trailing argument (here '--')
    # becomes $0, so "$@" contains only the file paths.
    # TLDR: loop through each file, invoking the Lambda with
    # the payload read from that file.
    for file in "$@"; do
        # AWS CLI v2 needs --cli-binary-format for raw JSON payloads,
        # and the trailing positional argument is the response output file.
        aws lambda invoke \
            --function-name "$OCULAR_PARAM_LAMBDA_NAME" \
            --cli-binary-format raw-in-base64-out \
            --payload "file://$file" \
            /dev/null
    done
parameters:
  LAMBDA_NAME:
    description: The name of the Lambda function to invoke
    required: true

Finally, create the uploader in your Ocular instance using the API:

curl -fsSL "${OCULAR_API_HOST}/api/v1/uploaders/trufflehog-lambda" \
    -X POST \
    -H "Authorization: Bearer ${OCULAR_API_TOKEN}" \
    -H "Accept: application/yaml" -H "Content-Type: application/x-yaml" \
    --data-binary @uploader.yaml

Once created, your Ocular instance can invoke your Lambda with scan results, ready for the next step.
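To confirm the uploader was saved, you can fetch it back from the API. This assumes the endpoint also supports GET requests, which is typical for REST-style APIs but worth verifying against the Ocular docs:

curl -fsSL "${OCULAR_API_HOST}/api/v1/uploaders/trufflehog-lambda" \
    -H "Authorization: Bearer ${OCULAR_API_TOKEN}" \
    -H "Accept: application/yaml"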

Step 3. Configure the scanning profile

In Ocular, a “profile” defines which scanner to run and where to send the results. In this case, we’ll use Trufflehog to detect secrets in Git repositories and send the output to our previously defined trufflehog-lambda uploader.

As with the uploader, we’ll define the profile in YAML. Save the following configuration as trufflehog-scanner.yaml:

# trufflehog-scanner.yaml
scanners:
    - image: trufflesecurity/trufflehog:latest
      command: ["/bin/sh", "-c"]
      args: ["trufflehog git file://. --json --no-update > $OCULAR_RESULTS_DIR/trufflehog-output.jsonl"]
artifacts:
    - trufflehog-output.jsonl
uploaders:
    - name: trufflehog-lambda
      parameters:
          LAMBDA_NAME: "Trufflehog-Processor"

Here’s what each section does:

  • “Scanners” defines the container image and command to run Trufflehog’s Git scanner. Ocular ensures the working directory is set to the downloaded target repository. We write to a file located in the $OCULAR_RESULTS_DIR folder, which is where all artifacts should be sent for collection. (You can try this scanner command locally; see the sketch after this list.)
  • “Artifacts” specifies the file to be uploaded after scanning. This must match the output path used in the scanner command. These files should all be relative to the $OCULAR_RESULTS_DIR directory.
  • “Uploaders” lists the configured uploader to invoke, along with its required parameters, in this case the Lambda function name.
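Before registering the profile, you can sanity-check the scanner command by running the same image against a local clone. A quick local test, assuming the image’s default entrypoint is the trufflehog binary (the repo path is a placeholder):

cd /path/to/some-repo   # any local Git clone
docker run --rm -v "$PWD:/repo" -w /repo trufflesecurity/trufflehog:latest \
    git file://. --json --no-update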

Use the following command to register this profile in Ocular: 

curl -fsSL "${OCULAR_API_HOST}/api/v1/profiles/trufflehog-scanner" \
    -X POST \
    -H "Authorization: Bearer ${OCULAR_API_TOKEN}" \
    -H "Accept: application/yaml" -H "Content-Type: application/x-yaml" \
    --data-binary @trufflehog-scanner.yaml

Before scheduling this to run automatically, it’s smart to validate your setup with a one-off scan. You can do this by starting a pipeline manually using the git downloader bundled with Ocular.

A “downloader” in Ocular is a container that pulls the target, such as a Git repository, onto disk, so the scanner can run against it. Using “git” as the downloader means Ocular will clone the specified repository and run your configured profile in the same directory. 

The example below triggers a pipeline to run the “trufflehog-scanner” profile on the repository https://github.com/crashappsec/ocular:

curl -fsSL "${OCULAR_API_HOST}/api/v1/pipelines" \
    -X POST \
    -H "Authorization: Bearer ${OCULAR_API_TOKEN}" \
    -H "Accept: application/json" -H "Content-Type: application/json" \
    -d '{"profileName":"trufflehog-scanner", "target": {"downloader": "git", "identifier": "https://github.com/crashappsec/ocular"}}'

This should spin up two Kubernetes jobs in your cluster: one for the downloader and scanners, and one for the uploader. After both jobs complete, you should see your Lambda function invoked with the output.
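While the pipeline runs, you can watch those jobs with kubectl. A quick sketch, assuming Ocular runs in a namespace named ocular (adjust for your install; the job and pod names are assigned by Ocular):

kubectl get jobs -n ocular --watch     # wait for both jobs to complete
kubectl get pods -n ocular             # locate the pods the jobs created
kubectl logs -n ocular <pod-name>      # inspect scanner or uploader output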

Step 4. Configure the search

With the scanning profile and uploader in place, the final step is to automate everything. In Ocular, this is done with a “search,” a scheduled task that runs a crawler to enumerate targets and trigger scans.

In this example, we’ll use Ocular’s built-in GitHub crawler to find every repository in your organization, run a scan using the trufflehog-scanner profile, and repeat the process every Sunday at midnight. 

The crawler will authenticate using your configured Ocular credentials and start pipelines for each discovered repo. You can define this scheduled search using the following curl command: 

curl -fsSL "${OCULAR_API_HOST}/api/v1/scheduled/searches" \
    -X POST \
    -H "Authorization: Bearer ${OCULAR_API_TOKEN}" \
    -H "Accept: application/json" -H "Content-Type: application/json" \
    -d '{ "crawlerName": "github", "schedule": "0 0 * * 0", "parameters": {"GITHUB_ORGS": "crashappsec", "DOWNLOADER": "git", "PROFILE": "example"}'

This configuration instructs Ocular to:

  • Use the GitHub crawler to list repositories under the crashappsec org
  • Download each repo using the git downloader 
  • Run the trufflehog-scanner profile
  • Execute the search every Sunday at 00:00
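To verify the search was registered, you can list the scheduled searches. As with the uploader check earlier, this assumes the endpoint supports GET; consult the Ocular API docs to confirm:

curl -fsSL "${OCULAR_API_HOST}/api/v1/scheduled/searches" \
    -H "Authorization: Bearer ${OCULAR_API_TOKEN}" \
    -H "Accept: application/json"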

Once scheduled, Ocular will continuously monitor and scan your codebase for exposed secrets without additional manual effort!

That’s it: your organization is now proactively scanning for exposed secrets across every GitHub repository, every week, without lifting a finger. With Ocular and Trufflehog working together, you’re not just checking a box; you’re putting real guardrails in place to catch risky leaks before they go live.

For more information on anything discussed, check out the Ocular documentation, and subscribe to our newsletter to stay updated on all things Crash Override!