Live Query

With Live Query, you can ask questions of endpoints and quickly identify areas for improving security and IT hygiene.

You can use recommended queries created by Carbon Black security experts or craft your own SQL queries. Live Query is powered by osquery, an open source project that uses an SQLite interface. This guide will get you started using Live Query via the Python SDK.

More information about the Audit and Remediation product, which uses Live Query, is available in the Carbon Black Cloud user guide.

More information about Live Query APIs is available on the Developer Network.


This guide shows how to find specific files on a system. This is the same scenario as the Quick Start Guide for the APIs on the Developer Network.

The steps we’ll go through are:

  1. Set up the Python imports and Carbon Black Cloud credentials

  2. Start the Query Run

  3. Look at the results

  4. Write the results to a file

  5. Clean up since this is a tutorial

  6. Get the run information for scheduled queries (templates)

Setting up

The code snippets assume that the Python environment has been set up with the necessary imports and credentials.

>>> from cbc_sdk import CBCloudAPI
>>> from cbc_sdk import audit_remediation
>>> from cbc_sdk.audit_remediation import Run, RunHistory, Result, ResultQuery
>>> api = CBCloudAPI(profile='sample')

For more information on credential handling in the SDK, see Authentication.

Start a Query Run

Set up the query for the file you are looking for. Then create a query object, execute it, and get the id of the run.

>>> query_string = "SELECT filename, path FROM file WHERE path = 'C:\\Windows\\Temp\\dbutil_2_3.sys' OR path LIKE 'C:\\Users\\%\\AppData\\Local\\Temp\\dbutil_2_3.sys';"
>>> query_object = api.select(Run).where(sql=query_string)
>>> run = query_object.submit()
>>> print(f'Run id: {run.id} has {run.active_org_devices} active devices in the org of which {run.in_progress_count} are in progress and {run.not_started_count} have not started')
Run id: vsc2be500dcuhc1q5bhvq7kdwoqh367i has 97 active devices in the org of which 0 are in progress and 97 have not started
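Since Live Query is backed by a SQLite interface, the LIKE pattern in the query above follows standard SQLite semantics: `%` matches any run of characters, so the pattern covers any user's Temp directory. A quick way to sanity-check a path pattern locally, before sending it to devices, is with the standard library's sqlite3 module (the sample paths here are made up for illustration):

```python
import sqlite3

# Evaluate the LIKE pattern locally with SQLite, the same engine
# that backs the osquery interface. '%' matches any run of characters.
conn = sqlite3.connect(":memory:")
pattern = r"C:\Users\%\AppData\Local\Temp\dbutil_2_3.sys"

paths = [
    r"C:\Users\alice\AppData\Local\Temp\dbutil_2_3.sys",  # should match
    r"C:\Users\bob\AppData\Local\Temp\dbutil_2_3.sys",    # should match
    r"C:\Windows\Temp\other.sys",                         # should not match
]

matches = [
    p for p in paths
    if conn.execute("SELECT ? LIKE ?", (p, pattern)).fetchone()[0]
]
print(matches)  # the two per-user Temp paths
```

Note that backslashes have no special meaning to SQLite's LIKE unless an ESCAPE clause is given, which is why the Windows paths can be matched directly.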

Check status

Give the run a few seconds to initialise, then refresh the information and print some statistics.

>>> run.refresh()
>>> print(f'Run id: {run.id} has {run.active_org_devices} active devices in the org of which {run.in_progress_count} are in progress and {run.not_started_count} have not started')
Run id: vsc2be500dcuhc1q5bhvq7kdwoqh367i has 97 active devices in the org of which 45 are in progress and 33 have not started

The run status contains full information about the progress of the query execution. These are some of the interesting fields, showing the number of devices available to be queried and the progress made so far.

  • active_org_devices: 97

  • error_count: 3

  • in_progress_count: 45

  • last_result_time: 2021-12-23T21:21:26.437Z

  • match_count: 0

  • no_match_count: 45

  • not_started_count: 40

  • status: ACTIVE

  • total_results: 0
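The refresh-then-check step generalizes to a polling loop: refresh until the status is no longer ACTIVE. This is a minimal sketch of that pattern using a stand-in object with the same `refresh()`/`status` shape (the real `Run` comes from `cbc_sdk.audit_remediation`, and with a live run you would use a sleep interval of a few seconds):

```python
import time

class FakeRun:
    """Stand-in for cbc_sdk.audit_remediation.Run, for illustration only."""
    def __init__(self):
        self._statuses = iter(["ACTIVE", "ACTIVE", "COMPLETE"])
        self.status = "ACTIVE"

    def refresh(self):
        # A real Run re-fetches its state from the Carbon Black Cloud here.
        self.status = next(self._statuses)

def wait_for_run(run, interval=0.0, timeout=60.0):
    """Poll run.refresh() until the run is no longer ACTIVE or timeout expires."""
    deadline = time.monotonic() + timeout
    while run.status == "ACTIVE" and time.monotonic() < deadline:
        time.sleep(interval)
        run.refresh()
    return run.status

print(wait_for_run(FakeRun()))  # COMPLETE
```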

All details of the run can be pretty-printed with:
>>> print(run)

Get the results

Partial results can be reviewed while the query is running. This snippet gets the results and prints the device information for each.

>>> result_query = api.select(Result).run_id(run.id)
>>> list_result = list(result_query)
>>> for result in list_result:
>>>     print(f'Device: {result.device_id} has status {result.status}.  Device message: {result.device_message}')
Device: 1234578 has status matched.  Device message:
Device: 3456789 has status error.  Device message: Error: database or disk is full
Device: 8765432 has status matched.  Device message:
There is also a helper option to get the results:
>>> results_by_helper = run.query_results()
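When a run covers many devices it is often useful to separate the matched devices from the ones that errored, as in the "database or disk is full" example above. That is plain Python once you have the result list; this sketch uses dictionaries shaped like the fields printed earlier (real `Result` objects expose `.status` and `.device_id` as attributes rather than keys):

```python
from collections import defaultdict

# Sample data shaped like the results shown above; illustrative only.
results = [
    {"device_id": 1234578, "status": "matched", "device_message": ""},
    {"device_id": 3456789, "status": "error",
     "device_message": "Error: database or disk is full"},
    {"device_id": 8765432, "status": "matched", "device_message": ""},
]

# Group device ids by result status for a quick summary.
by_status = defaultdict(list)
for r in results:
    by_status[r["status"]].append(r["device_id"])

print(dict(by_status))  # {'matched': [1234578, 8765432], 'error': [3456789]}
```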

Export results

It is possible to export the results in several formats, including CSV, zipped CSV, and streaming lines. These options are documented in cbc_sdk.audit_remediation.base.ResultQuery().

This snippet shows writing the results to a zipped csv file.

>>> result_query.export_zipped_csv("/Users/myname/mydir/")
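The exported archive is a regular zip of CSV data, so it can be read back with the standard library. This self-contained sketch builds a small in-memory archive standing in for the exported file (the member name and columns here are made up for illustration, not the SDK's actual layout):

```python
import csv
import io
import zipfile

# Build a sample archive in memory to stand in for the exported file.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("results.csv", "device_id,status\n1234578,matched\n3456789,error\n")

# Read the first CSV member back as a list of dicts.
buf.seek(0)
with zipfile.ZipFile(buf) as zf:
    with zf.open(zf.namelist()[0]) as member:
        rows = list(csv.DictReader(io.TextIOWrapper(member, encoding="utf-8")))

print(rows)  # [{'device_id': '1234578', 'status': 'matched'}, ...]
```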

For very large result sets there is an asynchronous API call. The SDK makes use of Python Futures to wait for the underlying call to complete.

For this call, in addition to Live Query permissions, the API key will require jobs.status(READ).

The sequence of calls is:

>>> # first an extra import
>>> from cbc_sdk.platform import Job
>>> # then start the job
>>> job = result_query.async_export()
>>> # show the status in progress
>>> print(job.status)
>>> # wait for it to finish and refresh the information in the SDK
>>> job_future = job.await_completion()
>>> finished_job = job_future.result()
>>> finished_job.refresh()
>>> # show the job has completed
>>> print(finished_job.status)
>>> # write the results to a csv file
>>> finished_job.get_output_as_file("/Users/myname/mydir/livequeryresults_async.csv")
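The `await_completion()`/`result()` sequence above is the standard Python futures pattern: submit work, then block on `result()` until it finishes. The same flow in plain `concurrent.futures`, with a stand-in function in place of the server-side export:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_export_job():
    """Stand-in for the server-side export; real exports run in Carbon Black Cloud."""
    time.sleep(0.1)  # simulate the export taking time
    return "COMPLETED"

with ThreadPoolExecutor() as pool:
    # Analogous to result_query.async_export(): kick the work off.
    future = pool.submit(fake_export_job)
    # Analogous to job.await_completion().result(): block until done.
    status = future.result(timeout=5)

print(status)  # COMPLETED
```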

Scroll results

If you would like to ingest all the Live Query results, whether from one Run or multiple Runs, consider using the scroll option to fetch the latest results. The scroll option is limited to the last 24 hours of results across all Runs. You need to specify either a time_received or a list of one or more Run ids.

>>> result_query = api.select(Result).set_time_received("-3h")
>>> list_results = result_query.scroll(10)
>>> print(f"num_remaining: {result_query.num_remaining}")
num_remaining: 35
>>> while result_query.num_remaining > 0:
>>>     list_results.extend(result_query.scroll(10))
>>> print(f"total results: {len(list_results)}")
total results: 45

Alternatively, if you want to get all the results over multiple days for a single Run, use the Run's id:

>>> result_query = api.select(Result).set_run_ids([run.id])
>>> list_results = result_query.scroll(10)
>>> print(f"num_remaining: {result_query.num_remaining}")
num_remaining: 62
>>> while result_query.num_remaining > 0:
>>>     list_results.extend(result_query.scroll(10))
>>> print(f"total results: {len(list_results)}")
total results: 72
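The drain-until-empty loop used in both snippets is ordinary pagination. This sketch mimics the `scroll()`/`num_remaining` contract with a stand-in query object, so the shape of the loop can be seen end to end:

```python
class FakeScrollQuery:
    """Stand-in mimicking ResultQuery's scroll()/num_remaining contract."""
    def __init__(self, total):
        self._items = list(range(total))
        self.num_remaining = total

    def scroll(self, rows):
        # Return the next page and update how many results are left.
        page, self._items = self._items[:rows], self._items[rows:]
        self.num_remaining = len(self._items)
        return page

query = FakeScrollQuery(45)
results = query.scroll(10)
while query.num_remaining > 0:
    results.extend(query.scroll(10))

print(len(results))  # 45
```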

Clean up

Since this is a tutorial we’ll clean up when we’re done by first stopping the run and then deleting it.

Stopping the run will prevent the request from going to any devices that have not yet checked in, but will not stop the query running on any that are in progress. Checking in the console, the run and results will be visible with a status of Stopped.

>>> run.stop()
>>> print(run.status)

Since this is a tutorial, we can fully clean up. This deletes the results, so it is probably not what you usually want. The run will no longer be visible in the console, and attempting to refresh the object will return the error “cannot refresh a deleted query”.

>>> run.delete()

Scheduled runs (templates)

A template is a query that is scheduled to run periodically. It is likely easier to configure these using the Carbon Black Cloud console, but retrieving the results for import into another system may be useful.

An additional import:

>>> from cbc_sdk.audit_remediation import Template, TemplateHistory

List all the templates (scheduled queries):

>>> all_templates = api.select(Template)
>>> for t in list(all_templates):
>>>     print(f'Name = {t.name}, id = {t.id}, next run time = {t.next_run_time}')

A where clause can be added to limit the templates returned. Each execution of the scheduled query is a Run.

>>> templates = list(api.select(Template).where("CBC SDK Demo Template"))
>>> for template in templates:
>>>     print(f'template name = {template.name}, id = {template.id}, next run time = {template.next_run_time}')
>>>     # and then get all the runs for each template
>>>     runs = list(api.select(RunHistory).where(template_id=template.id))
>>>     for run in runs:
>>>         print(f'Run id = {run.id}, Run Status = {run.status}, Run create time = {run.create_time}, Results Returned = {run.total_results}, Template Id = {run.template_id}')
name =  CBC SDK Demo Template   id =  p7qtvxms0oaju46whcrfmyppa9fiqpn9
Run id = huoobhistdtxxpzhmg52yns7wmsuvjyx, Run Status = ACTIVE, Run create time = 2022-01-19T21:00:00.000Z, Results Returned = 2333, Template Id = p7qtvxms0oaju46whcrfmyppa9fiqpn9
Run id = bdygnd8jvpjdqjmatdsuqzopaxebquqb, Run Status = TIMED_OUT, Run create time = 2022-01-18T21:00:00.000Z, Results Returned = 2988, Template Id = p7qtvxms0oaju46whcrfmyppa9fiqpn9