API Reference

Tip: get help directly in JupyterLab

The API documentation shown below for the command line interface (CLI) and Python client can be referenced at any time directly from JupyterLab.

To view Python documentation in a Jupyter Notebook or the Jupyter text editor, click on the module or function name and press the Control key:

(Screenshot: JupyterLab hover documentation)

To view the API documentation for the CLI, use the -h/--help option with any command or subcommand:

$ quantrocket zipline create-usstock-bundle -h

quantrocket.account

account service

QuantRocket account CLI

usage: quantrocket account [-h] {balance,portfolio,rates} ...

subcommands

subcommand

Possible choices: balance, portfolio, rates

Sub-commands

balance

query account balances

quantrocket account balance [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD] [-l]
                            [-a [ACCOUNT ...]] [-b [FIELD:AMOUNT ...]]
                            [-o OUTFILE] [-j] [-f [FIELD ...]]
                            [--force-refresh]

Named Arguments

--force-refresh

refresh account balances to ensure the latest data (default is to query the database, which is refreshed every minute)

Default: False

filtering options

-s, --start-date

limit to account balance snapshots taken on or after this date

-e, --end-date

limit to account balance snapshots taken on or before this date

-l, --latest

return the latest account balance snapshot

Default: False

-a, --accounts

limit to these accounts

-b, --below

limit to accounts where the specified field is below the specified amount (pass as field:amount, for example Cushion:0.05)

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-f, --fields

only return these fields. By default a core set of fields is returned. Pass a list of fields, or ‘*’ to return all fields. Pass ‘?’ or any invalid fieldname to see available fields.

Query account balances.

Examples

Query the latest account balances:

quantrocket account balance --latest

Query the latest NLV (Net Liquidation Value) for a particular account:

quantrocket account balance --latest --fields NetLiquidation --accounts U123456

Check for accounts that have fallen below a 5% cushion and log the results, if any, to flightlog:

quantrocket account balance --latest --below Cushion:0.05 | quantrocket flightlog log --name quantrocket.account --level CRITICAL

Query historical account balances over a date range:

quantrocket account balance --start-date 2017-06-01 --end-date 2018-01-31
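The `field:amount` syntax accepted by `--below` can be parsed along these lines (a sketch for illustration; `parse_below` is a hypothetical helper, not part of the QuantRocket CLI, which performs this parsing itself):

```python
def parse_below(filters):
    """Parse 'field:amount' strings (e.g. 'Cushion:0.05') into a dict.

    Hypothetical helper for illustration only.
    """
    parsed = {}
    for f in filters:
        # split on the first colon: field name on the left, numeric amount on the right
        field, _, amount = f.partition(":")
        if not field or not amount:
            raise ValueError(f"expected 'field:amount', got: {f}")
        parsed[field] = float(amount)
    return parsed

print(parse_below(["Cushion:0.05"]))  # {'Cushion': 0.05}
```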

portfolio

download current portfolio

quantrocket account portfolio [-h] [-b [BROKER ...]] [-a [ACCOUNT ...]]
                              [-t [SEC_TYPE ...]] [-e [EXCHANGE ...]]
                              [-i [SID ...]] [-s [SYMBOL ...]] [-z]
                              [-o OUTFILE] [-j] [-f [FIELD ...]]

filtering options

-b, --brokers

Possible choices: alpaca, ibkr

limit to these brokers. Possible choices: [‘alpaca’, ‘ibkr’]

-a, --accounts

limit to these accounts

-t, --sec-types

limit to these security types

-e, --exchanges

limit to these exchanges

-i, --sids

limit to these sids

-s, --symbols

limit to these symbols

-z, --zero

include zero position rows (default is to exclude them). Only supported for Interactive Brokers.

Default: False

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-f, --fields

only return these fields. By default a core set of fields is returned. Pass a list of fields, or ‘*’ to return all fields. Pass ‘?’ or any invalid fieldname to see available fields.

Download current portfolio.

Examples

View current portfolio in terminal:

quantrocket account portfolio | csvlook

Download current portfolio for a particular account and save to file:

quantrocket account portfolio --accounts U12345 -o portfolio.csv

rates

query exchange rates for the base currency

quantrocket account rates [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD] [-l]
                          [-b [CURRENCY ...]] [-q [CURRENCY ...]] [-o OUTFILE]
                          [-j]

filtering options

-s, --start-date

limit to exchange rates on or after this date

-e, --end-date

limit to exchange rates on or before this date

-l, --latest

return the latest exchange rates

Default: False

-b, --base-currencies

limit to these base currencies

-q, --quote-currencies

limit to these quote currencies

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

Query exchange rates for the base currency.

The exchange rates in the exchange rate database are sourced from the European Central Bank’s reference rates, which are updated each day at 4 PM CET.

Examples

Query the latest exchange rates:

quantrocket account rates --latest
quantrocket.account.download_account_balances(filepath_or_buffer=None, output='csv', start_date=None, end_date=None, latest=False, accounts=None, below=None, fields=None, force_refresh=False)

Query account balances.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json or csv, default is csv)

  • start_date (str (YYYY-MM-DD), optional) – limit to account balance snapshots taken on or after this date

  • end_date (str (YYYY-MM-DD), optional) – limit to account balance snapshots taken on or before this date

  • latest (bool) – return the latest account balance snapshot

  • accounts (list of str, optional) – limit to these accounts

  • below (dict of FIELD:AMOUNT, optional) – limit to accounts where the specified field is below the specified amount (pass as {field:amount}, for example {‘Cushion’:0.05})

  • fields (list of str, optional) – only return these fields. By default a core set of fields is returned. Pass a list of fields, or ‘*’ to return all fields. Pass [‘?’] or any invalid fieldname to see available fields.

  • force_refresh (bool) – refresh account balances to ensure the latest data (default is to query the database, which is refreshed every minute)

Return type:

None

Examples

Query latest balances. You can use StringIO to load the CSV into pandas.

>>> import io
>>> import pandas as pd
>>> f = io.StringIO()
>>> download_account_balances(f, latest=True)
>>> balances = pd.read_csv(f, parse_dates=["LastUpdated"])
quantrocket.account.download_account_portfolio(filepath_or_buffer=None, output='csv', brokers=None, accounts=None, sec_types=None, exchanges=None, sids=None, symbols=None, include_zero=False, fields=None)

Download current portfolio.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json or csv, default is csv)

  • brokers (list of str, optional) – limit to these brokers. Possible choices: alpaca, ibkr

  • accounts (list of str, optional) – limit to these accounts

  • sec_types (list of str, optional) – limit to these security types

  • exchanges (list of str, optional) – limit to these exchanges

  • sids (list of str, optional) – limit to these sids

  • symbols (list of str, optional) – limit to these symbols

  • include_zero (bool) – include zero position rows (default is to exclude them). Only supported for Interactive Brokers.

  • fields (list of str, optional) – only return these fields. By default a core set of fields is returned. Pass a list of fields, or ‘*’ to return all fields. Pass ‘?’ or any invalid fieldname to see available fields.

Return type:

None

Examples

Download current portfolio. You can use StringIO to load the CSV into pandas.

>>> f = io.StringIO()
>>> download_account_portfolio(f)
>>> portfolio = pd.read_csv(f, parse_dates=["LastUpdated"])
quantrocket.account.download_exchange_rates(filepath_or_buffer=None, output='csv', start_date=None, end_date=None, latest=False, base_currencies=None, quote_currencies=None)

Query exchange rates for the base currency.

The exchange rates in the exchange rate database are sourced from the European Central Bank’s reference rates, which are updated each day at 4 PM CET.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json, csv, default is csv)

  • start_date (str (YYYY-MM-DD), optional) – limit to exchange rates on or after this date

  • end_date (str (YYYY-MM-DD), optional) – limit to exchange rates on or before this date

  • latest (bool) – return the latest exchange rates

  • base_currencies (list of str, optional) – limit to these base currencies

  • quote_currencies (list of str, optional) – limit to these quote currencies

Return type:

None

Examples

Query latest exchange rates. You can use StringIO to load the CSV into pandas.

>>> f = io.StringIO()
>>> download_exchange_rates(f, latest=True)
>>> rates = pd.read_csv(f, parse_dates=["Date"])
 

Account API

Resource Group

Account Balances

Get Account Balances
GET/account/balances.{output}{?start_date,end_date,latest,accounts,below,fields,force_refresh}

Query account balances.

Example URI

GET http://houston/account/balances.csv?start_date=2017-01-01&end_date=2018-01-01&latest=true&accounts=U123456&below=Cushion:0.05&fields=NetLiquidation&force_refresh=true
URI Parameters
output
str (required) Example: csv

output format

Choices: csv json

start_date
str (optional) Example: 2017-01-01

limit to account balance snapshots taken on or after this date

end_date
str (optional) Example: 2018-01-01

limit to account balance snapshots taken on or before this date

latest
bool (optional) Example: true

return the latest account balance snapshot

accounts
str (optional) Example: U123456

limit to these accounts (pass multiple times for multiple accounts)

below
str (optional) Example: Cushion:0.05

limit to accounts where the specified field is below the specified amount (pass as ‘field:amount’, for example ‘Cushion:0.05’) (pass multiple times for multiple filters)

fields
str (optional) Example: NetLiquidation

only return these fields (pass ‘?’ or any invalid fieldname to see available fields) (pass multiple times for multiple fields)

force_refresh
bool (optional) Example: true

refresh account balances to ensure the latest data (default is to query the database, which is refreshed every minute)

Response  200
Headers
Content-Type: text/csv
Body
Broker,Account,Currency,NetLiquidation,LastUpdated
ibkr,DU123456,USD,500000.0,"2017-12-16 14:28:54"
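Because list parameters such as `accounts` are passed by repeating the key, the query string can be assembled with `urllib.parse.urlencode(..., doseq=True)`. A sketch (the `houston` host name is taken from the example URI above; the second account is invented for illustration):

```python
from urllib.parse import urlencode

params = {
    "latest": "true",
    "accounts": ["U123456", "U234567"],  # list values become repeated keys
    "below": "Cushion:0.05",
}
# doseq=True expands each list value into its own key=value pair
url = "http://houston/account/balances.csv?" + urlencode(params, doseq=True)
print(url)
```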

Exchange rates

Get Exchange Rates
GET/account/rates.{output}{?start_date,end_date,latest,base_currencies,quote_currencies}

Query exchange rates for the base currency.

The exchange rates in the exchange rate database are sourced from the European Central Bank’s reference rates, which are updated each day at 4 PM CET.

Example URI

GET http://houston/account/rates.csv?start_date=2017-01-01&end_date=2018-01-01&latest=true&base_currencies=USD&quote_currencies=CAD
URI Parameters
output
str (required) Example: csv

output format

Choices: csv json

start_date
str (optional) Example: 2017-01-01

limit to exchange rates on or after this date

end_date
str (optional) Example: 2018-01-01

limit to exchange rates on or before this date

latest
bool (optional) Example: true

return the latest exchange rates

base_currencies
str (optional) Example: USD

limit to these base currencies (pass multiple times for multiple base currencies)

quote_currencies
str (optional) Example: CAD

limit to these quote currencies (pass multiple times for multiple quote currencies)

Response  200
Headers
Content-Type: text/csv
Body
BaseCurrency,QuoteCurrency,Rate,Date
USD,AUD,1.2774,2018-01-09
USD,CAD,1.2425,2018-01-09
USD,CHF,0.98282,2018-01-09
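A rates CSV like the response above can be used to convert amounts out of the base currency. A minimal sketch using only the standard library, with the sample rows embedded as a string:

```python
import csv
import io

# Sample body from the response above
sample = """BaseCurrency,QuoteCurrency,Rate,Date
USD,AUD,1.2774,2018-01-09
USD,CAD,1.2425,2018-01-09
USD,CHF,0.98282,2018-01-09
"""

# Index rates by (base, quote) currency pair
rates = {
    (row["BaseCurrency"], row["QuoteCurrency"]): float(row["Rate"])
    for row in csv.DictReader(io.StringIO(sample))
}

# Convert 1000 USD (the base currency) into CAD
cad = 1000 * rates[("USD", "CAD")]
print(round(cad, 2))  # 1242.5
```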

Account Portfolio

Get Account Portfolio
GET/account/portfolio.{output}{?brokers,accounts,sec_types,exchanges,sids,symbols,include_zero,fields}

Download current portfolio.

Example URI

GET http://houston/account/portfolio.csv?brokers=ibkr&accounts=U12345&sec_types=STK&exchanges=XNYS&sids=FI12345&symbols=AAPL&include_zero=true&fields=Sid
URI Parameters
output
str (required) Example: csv

output format

Choices: csv json

brokers
str (optional) Example: ibkr

limit to these brokers.

Choices: alpaca ibkr

accounts
str (optional) Example: U12345

limit to these accounts

sec_types
str (optional) Example: STK

limit to these security types

exchanges
str (optional) Example: XNYS

limit to these exchanges

sids
str (optional) Example: FI12345

limit to these sids

symbols
str (optional) Example: AAPL

limit to these symbols

include_zero
bool (optional) Example: true

include zero position rows (default is to exclude them). Only supported for Interactive Brokers.

fields
str (optional) Example: Sid

only return these fields. By default a core set of fields is returned. Pass a list of fields, or ‘*’ to return all fields. Pass ‘?’ or any invalid fieldname to see available fields.

Response  200
Headers
Content-Type: text/csv

quantrocket.blotter

order management and trade ledger service

QuantRocket blotter CLI

usage: quantrocket blotter [-h]
                           {order,cancel,status,positions,close,executions,record,split,pnl}
                           ...

subcommands

subcommand

Possible choices: order, cancel, status, positions, close, executions, record, split, pnl

Sub-commands

order

place one or more orders

quantrocket blotter order [-h] [-f INFILE | -p [PARAM:VALUE ...]]

Named Arguments

-f, --infile

place orders from this CSV or JSON file (specify ‘-’ to read file from stdin)

-p, --params

order details as multiple key-value pairs (pass as ‘param:value’, for example OrderType:MKT)

Place one or more orders.

Returns a list of order IDs, which can be used to cancel the orders or check their status.

Examples

Place orders from a CSV file:

quantrocket blotter order -f orders.csv

Place orders from a JSON file:

quantrocket blotter order -f orders.json

Place an order by specifying the order parameters on the command line:

quantrocket blotter order --params Sid:FIBBG123456 Action:BUY Exchange:SMART TotalQuantity:100 OrderType:MKT Tif:Day Account:DU12345 OrderRef:my-strategy
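An orders file for `quantrocket blotter order -f orders.csv` can be generated with the standard csv module. The column names below mirror the order parameters shown in the example above; this is a sketch, not an exhaustive list of supported columns:

```python
import csv

# One dict per order; keys become the CSV column headers
orders = [
    {
        "Sid": "FIBBG123456",
        "Action": "BUY",
        "Exchange": "SMART",
        "TotalQuantity": 100,
        "OrderType": "MKT",
        "Tif": "Day",
        "Account": "DU12345",
        "OrderRef": "my-strategy",
    }
]

with open("orders.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(orders[0]))
    writer.writeheader()
    writer.writerows(orders)
```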

cancel

cancel one or more orders by order ID, sid, or order ref

quantrocket blotter cancel [-h] [-d [ORDER_ID ...]] [-i [SID ...]]
                           [-r [ORDER_REF ...]] [-a [ACCOUNT ...]] [--all]

Named Arguments

-d, --order-ids

cancel these order IDs

-i, --sids

cancel orders for these sids

-r, --order-refs

cancel orders for these order refs

-a, --accounts

cancel orders for these accounts

--all

cancel all open orders

Default: False

Cancel one or more orders by order ID, sid, or order ref.

Examples

Cancel orders by order ID:

quantrocket blotter cancel -d 6002:45 6001:46

Cancel orders by sid:

quantrocket blotter cancel -i FIBBG123456

Cancel orders by order ref:

quantrocket blotter cancel --order-refs my-strategy

Cancel all open orders:

quantrocket blotter cancel --all

status

download order statuses

quantrocket blotter status [-h] [-d [ORDER_ID ...]] [-i [SID ...]]
                           [-r [ORDER_REF ...]] [-a [ACCOUNT ...]] [--open]
                           [-s YYYY-MM-DD] [-e YYYY-MM-DD] [-f [FIELD ...]]
                           [-o OUTFILE] [-j]

filtering options

-d, --order-ids

limit to these order IDs

-i, --sids

limit to orders for these sids

-r, --order-refs

limit to orders for these order refs

-a, --accounts

limit to orders for these accounts

--open

limit to open orders

Default: False

-s, --start-date

limit to orders submitted on or after this date

-e, --end-date

limit to orders submitted on or before this date

output options

-f, --fields

return these fields in addition to the default fields (pass ‘?’ or any invalid fieldname to see available fields)

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

Download order statuses.

Examples

Download order status by order ID and save to file:

quantrocket blotter status -d 6002:45 6001:46 -o statuses.csv

Download order status for all open orders and display in terminal:

quantrocket blotter status --open | csvlook

Download order status with extra fields and display as YAML:

quantrocket blotter status --open --fields Exchange LmtPrice --json | json2yaml

Download order status of open orders by sid:

quantrocket blotter status -i FIBBG123456 --open

Download order status of open orders by order ref:

quantrocket blotter status --order-refs my-strategy --open

positions

query current positions

quantrocket blotter positions [-h] [-i [SID ...]] [-r [ORDER_REF ...]]
                              [-a [ACCOUNT ...]] [--diff] [--broker]
                              [-o OUTFILE] [-j]

filtering options

-i, --sids

limit to these sids

-r, --order-refs

limit to these order refs (not supported with broker view)

-a, --accounts

limit to these accounts

--diff

limit to positions where the blotter quantity and broker quantity disagree (requires --broker)

Default: False

output options

--broker

return ‘broker’ view of positions (by account and sid) instead of default ‘blotter’ view (by account, sid, and order ref)

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

Query current positions.

There are two ways to view positions: blotter view (default) and broker view.

The default “blotter view” returns positions by account, sid, and order ref. Positions are tracked based on execution records saved to the blotter database.

“Broker view” (using the --broker option) returns positions by account and sid (but not order ref) as reported directly by the broker.
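The relationship between the two views can be illustrated by aggregating blotter-view rows (account, sid, order ref) down to broker view (account, sid). The position rows below are invented for illustration:

```python
from collections import defaultdict

# Blotter view: one row per (account, sid, order ref)
blotter_positions = [
    {"Account": "DU12345", "Sid": "FIBBG123456", "OrderRef": "strategy-a", "Quantity": 100},
    {"Account": "DU12345", "Sid": "FIBBG123456", "OrderRef": "strategy-b", "Quantity": -40},
    {"Account": "DU12345", "Sid": "FIBBG987654", "OrderRef": "strategy-a", "Quantity": 200},
]

# Broker view: net quantity per (account, sid), order refs collapsed
broker_view = defaultdict(int)
for pos in blotter_positions:
    broker_view[(pos["Account"], pos["Sid"])] += pos["Quantity"]

print(dict(broker_view))
# {('DU12345', 'FIBBG123456'): 60, ('DU12345', 'FIBBG987654'): 200}
```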

Examples

Query current positions:

quantrocket blotter positions

Save current positions to CSV file:

quantrocket blotter positions --outfile positions.csv

Query positions for a single order ref:

quantrocket blotter positions --order-refs my-strategy

Query positions using broker view:

quantrocket blotter positions --broker

close

generate orders to close positions

quantrocket blotter close [-h] [-i [SID ...]] [-r [ORDER_REF ...]]
                          [-a [ACCOUNT ...]] [-o OUTFILE]
                          [-p [PARAM:VALUE ...]] [-j]

filtering options

-i, --sids

limit to these sids

-r, --order-refs

limit to these order refs

-a, --accounts

limit to these accounts

output options

-o, --outfile

filename to write the data to (default is stdout)

-p, --params

additional parameters to append to each row in output (pass as ‘param:value’, for example OrderType:MKT)

-j, --json

format output as JSON (default is CSV)

Generate orders to close positions.

Doesn’t actually place any orders but returns an orders file that can be placed separately. Additional order parameters can be appended with the --params option.

This endpoint can also be used to generate executions for marking a position as closed due to a tender offer, merger/acquisition, etc. (See quantrocket blotter record for more info.)

Examples

Generate MKT orders to close positions for a particular strategy:

quantrocket blotter close --order-refs my-strategy --params OrderType:MKT Tif:DAY Exchange:SMART

Generate orders and also place them:

quantrocket blotter close -r my-strategy -p OrderType:MKT Tif:DAY Exchange:SMART | quantrocket blotter order -f -

After receiving 23.50 per share in a tender offer for a position, record the execution in the blotter in order to mark the position as closed:

quantrocket blotter close --sids FIBBG123456 --params Price:23.50 | quantrocket blotter record -f -

executions

query executions from the executions database

quantrocket blotter executions [-h] [-i [SID ...]] [-r [ORDER_REF ...]]
                               [-a [ACCOUNT ...]] [-s YYYY-MM-DD]
                               [-e YYYY-MM-DD] [-o OUTFILE]

filtering options

-i, --sids

limit to these sids

-r, --order-refs

limit to these order refs

-a, --accounts

limit to these accounts

-s, --start-date

limit to executions on or after this date

-e, --end-date

limit to executions on or before this date

output options

-o, --outfile

filename to write the data to (default is stdout)

Query executions from the executions database.

Examples

Get a CSV of all executions:

quantrocket blotter executions -o executions.csv

record

record executions that happened outside of QuantRocket’s knowledge

quantrocket blotter record [-h] [-f INFILE | -p [PARAM:VALUE ...]]

Named Arguments

-f, --infile

record executions from this CSV or JSON file (specify ‘-’ to read file from stdin)

-p, --params

execution details as multiple key-value pairs (pass as ‘param:value’, for example Price:23.50)

Record executions that happened outside of QuantRocket’s knowledge.

This endpoint does not interact with the broker but simply adds one or more executions to the blotter database and updates the blotter’s record of current positions accordingly. It can be used to bring the blotter in line with the broker when they differ. For example, when a position is liquidated because of a tender offer or merger/acquisition, you can use this endpoint to record the price received for your shares.

Returns a list of execution IDs inserted into the database.

Notes

The required params are:

  • Account

  • Action (“BUY” or “SELL”)

  • OrderRef

  • Price

  • Sid

  • TotalQuantity

Optional params (rarely needed):

  • Commission (default is 0)

  • OrderId (default is an auto-generated ID)

  • Time (the time of execution, default is now)

Examples

After receiving 23.50 per share in a tender offer for a position, record the execution in the blotter in order to mark the position as closed:

quantrocket blotter close --sids FIBBG123456 --params Price:23.50 | quantrocket blotter record -f -

Record executions from a CSV file:

quantrocket blotter record -f executions.csv

Record an execution by specifying the parameters on the command line:

quantrocket blotter record --params Sid:FIBBG123456 Action:BUY TotalQuantity:100 Account:DU12345 OrderRef:my-strategy Price:23.50
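The required params listed above can be checked before recording. A sketch of such a pre-flight check (`check_execution` is a hypothetical helper for illustration; the blotter performs its own validation server-side):

```python
# Required params per the docs above
REQUIRED_PARAMS = {"Account", "Action", "OrderRef", "Price", "Sid", "TotalQuantity"}

def check_execution(execution):
    """Raise ValueError if any required execution param is missing or invalid."""
    missing = REQUIRED_PARAMS - execution.keys()
    if missing:
        raise ValueError(f"missing required params: {sorted(missing)}")
    if execution["Action"] not in ("BUY", "SELL"):
        raise ValueError("Action must be 'BUY' or 'SELL'")

# Passes silently: all required params are present
check_execution({
    "Sid": "FIBBG123456",
    "Action": "BUY",
    "TotalQuantity": 100,
    "Account": "DU12345",
    "OrderRef": "my-strategy",
    "Price": 23.50,
})
```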

split

apply a stock split to an open position

quantrocket blotter split [-h] -i SID -o INT -n INT

Named Arguments

-i, --sid

the sid that underwent a split. There must currently be an open position in this security.

-o, --old-shares

the number of pre-split shares

-n, --new-shares

the number of post-split shares

Apply a stock split to an open position.

This endpoint does not interact with the broker but simply applies the split in the blotter database to bring the blotter in line with the broker. The split is also applied to the executions that created the open position, so that PNL calculations will be accurate.

The --old-shares and --new-shares parameters can be specified either using the published split ratio (for example, 2-for-1) or the actual number of pre- and post-split shares in your account.
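The arithmetic behind the adjustment is simple: quantities scale by new/old and per-share prices by old/new, so the total cost basis is unchanged. A sketch with the 2-for-1 case (the position values are invented; the blotter applies this adjustment to the stored executions itself):

```python
def apply_split(quantity, price, old_shares, new_shares):
    """Scale an open position's quantity and per-share cost for a stock split.

    Illustrative only; not part of the QuantRocket API.
    """
    ratio = new_shares / old_shares
    return quantity * ratio, price / ratio

# 2-for-1 split: 100 shares @ 50.00 becomes 200 shares @ 25.00
qty, px = apply_split(100, 50.0, old_shares=1, new_shares=2)
print(qty, px)  # 200.0 25.0

# Total cost basis is unchanged by the split
assert qty * px == 100 * 50.0
```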

Examples

Record a 2-for-1 split:

quantrocket blotter split --sid FIBBG12345 --old-shares 1 --new-shares 2

Record a 1-for-10 reverse split:

quantrocket blotter split --sid FIBBG98765 --old-shares 10 --new-shares 1

pnl

query trading performance and return a PDF tear sheet or CSV of results

quantrocket blotter pnl [-h] [-i [SID ...]] [-r [ORDER_REF ...]]
                        [-a [ACCOUNT ...]] [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                        [-d] [-t TIMEZONE] [--pdf] [-o OUTFILE]

filtering options

-i, --sids

limit to these sids

-r, --order-refs

limit to these order refs

-a, --accounts

limit to these accounts

-s, --start-date

limit to pnl on or after this date

-e, --end-date

limit to pnl on or before this date

output options

-d, --details

return detailed results for all securities instead of aggregating to account/order ref level (only supported for a single account and order ref at a time)

Default: False

-t, --timezone

return execution times in this timezone (default UTC)

--pdf

return a PDF tear sheet of PNL (default is to return a CSV)

-o, --outfile

filename to write the data to (default is stdout)

Query trading performance and return a PDF tear sheet or CSV of results.

Trading performance is broken down by account and order ref and optionally by sid.

Examples

Get a Moonchart PDF tear sheet of all trading performance:

quantrocket blotter pnl -o pnl.pdf --pdf

Get a PDF for a single account and order ref, broken down by sid:

quantrocket blotter pnl --accounts U12345 --order-refs mystrategy1 --details --pdf -o pnl_details.pdf

Get a CSV of performance results for a particular date range:

quantrocket blotter pnl -s 2018-03-01 -e 2018-06-30 -o pnl_2018Q2.csv
quantrocket.blotter.place_orders(orders=None, infilepath_or_buffer=None)

Place one or more orders.

Returns a list of order IDs, which can be used to cancel the orders or check their status.

Parameters:
  • orders (list of dict of PARAM:VALUE, optional) – a list of one or more orders, where each order is a dict specifying the order parameters (see examples)

  • infilepath_or_buffer (str or file-like object, optional) – place orders from this CSV or JSON file (specify ‘-’ to read file from stdin). Mutually exclusive with orders argument.

Returns:

order IDs

Return type:

list

Examples

>>> orders = []
>>> order1 = {
...     'Sid':'FIBBG123456',
...     'Action':'BUY',
...     'Exchange':'SMART',
...     'TotalQuantity':100,
...     'OrderType':'MKT',
...     'Tif':'Day',
...     'Account':'DU12345',
...     'OrderRef':'my-strategy'
... }
>>> orders.append(order1)
>>> order_ids = place_orders(orders)
quantrocket.blotter.cancel_orders(order_ids=None, sids=None, order_refs=None, accounts=None, cancel_all=None)

Cancel one or more orders by order ID, sid, or order ref.

Parameters:
  • order_ids (list of str, optional) – cancel these order IDs

  • sids (list of str, optional) – cancel orders for these sids

  • order_refs (list of str, optional) – cancel orders for these order refs

  • accounts (list of str, optional) – cancel orders for these accounts

  • cancel_all (bool) – cancel all open orders

Returns:

status message

Return type:

dict

Examples

Cancel orders by order ID:

>>> cancel_orders(order_ids=['6002:45','6002:46'])

Cancel orders by sid:

>>> cancel_orders(sids=["FIBBG123456"])

Cancel orders by order ref:

>>> cancel_orders(order_refs=['my-strategy'])

Cancel all open orders:

>>> cancel_orders(cancel_all=True)
quantrocket.blotter.download_order_statuses(filepath_or_buffer=None, output='csv', order_ids=None, sids=None, order_refs=None, accounts=None, open_orders=None, start_date=None, end_date=None, fields=None)

Download order statuses.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json or csv, default is csv)

  • order_ids (list of str, optional) – limit to these order IDs

  • sids (list of str, optional) – limit to orders for these sids

  • order_refs (list of str, optional) – limit to orders for these order refs

  • accounts (list of str, optional) – limit to orders for these accounts

  • open_orders (bool) – limit to open orders

  • start_date (str (YYYY-MM-DD), optional) – limit to orders submitted on or after this date

  • end_date (str (YYYY-MM-DD), optional) – limit to orders submitted on or before this date

  • fields (list of str, optional) – return these fields in addition to the default fields (pass ‘?’ or any invalid fieldname to see available fields)

Return type:

None

Examples

Download order status by order ID and load into Pandas:

>>> f = io.StringIO()
>>> download_order_statuses(f, order_ids=['6001:45','6001:46'])
>>> order_statuses = pd.read_csv(f)

Download order status for all open orders and include extra fields in output:

>>> download_order_statuses(open_orders=True, fields=["LmtPrice", "OcaGroup"])

Download order status of open orders by sid:

>>> download_order_statuses(sids=["FIBBG123456"], open_orders=True)

Download order status of open orders by order ref:

>>> download_order_statuses(order_refs=['my-strategy'], open_orders=True)
quantrocket.blotter.download_positions(filepath_or_buffer=None, output='csv', order_refs=None, accounts=None, sids=None, view='blotter', diff=False)

Query current positions and write results to file.

To return positions as a Python list, see list_positions.

There are two ways to view positions: blotter view (default) and broker view.

The default “blotter view” returns positions by account, sid, and order ref. Positions are tracked based on execution records saved to the blotter database.

“Broker view” (view=’broker’) returns positions by account and sid (but not order ref) as reported directly by the broker.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json or csv, default is csv)

  • order_refs (list of str, optional) – limit to these order refs (not supported with broker view)

  • accounts (list of str, optional) – limit to these accounts

  • sids (list of str, optional) – limit to these sids

  • view (str, optional) – whether to return ‘broker’ view of positions (by account and sid) or default ‘blotter’ view (by account, sid, and order ref). Choices are: blotter, broker

  • diff (bool) – limit to positions where the blotter quantity and broker quantity disagree (requires view=’broker’)

Return type:

None

See also

list_positions

load positions into Python list

quantrocket.blotter.list_positions(order_refs=None, accounts=None, sids=None, view='blotter', diff=False)

Query current positions and return them as a Python list.

There are two ways to view positions: blotter view (default) and broker view.

The default “blotter view” returns positions by account, sid, and order ref. Positions are tracked based on execution records saved to the blotter database.

“Broker view” (view=’broker’) returns positions by account and sid (but not order ref) as reported directly by the broker.

Parameters:
  • order_refs (list of str, optional) – limit to these order refs (not supported with broker view)

  • accounts (list of str, optional) – limit to these accounts

  • sids (list of str, optional) – limit to these sids

  • view (str, optional) – whether to return ‘broker’ view of positions (by account and sid) or default ‘blotter’ view (by account, sid, and order ref). Choices are: blotter, broker

  • diff (bool) – limit to positions where the blotter quantity and broker quantity disagree (requires view=’broker’)

Return type:

list

Examples

Query current positions and load into Pandas:

>>> positions = list_positions()
>>> if positions:
...     positions = pd.DataFrame(positions)
quantrocket.blotter.close_positions(filepath_or_buffer=None, output='csv', order_refs=None, accounts=None, sids=None, params=None)

Generate orders to close positions.

Doesn’t actually place any orders but returns an orders file that can be placed separately. Additional order parameters can be appended with the params argument.

This endpoint can also be used to generate executions for marking a position as closed due to a tender offer, merger/acquisition, etc. (See quantrocket.blotter.record_executions for more info.)

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json or csv, default is csv)

  • order_refs (list of str, optional) – limit to these order refs

  • accounts (list of str, optional) – limit to these accounts

  • sids (list of str, optional) – limit to these sids

  • params (dict of PARAM:VALUE, optional) – additional parameters to append to each row in output (pass as {param:value}, for example {“OrderType”:”MKT”})

Return type:

None

See also

place_orders

place one or more orders

record_executions

record executions that happened outside of QuantRocket’s knowledge

Notes

Usage Guide:

Examples

Get orders to close positions, then place the orders:

>>> from quantrocket.blotter import place_orders, close_positions
>>> import io
>>> orders_file = io.StringIO()
>>> close_positions(orders_file, params={"OrderType":"MKT", "Tif":"DAY", "Exchange":"SMART"})
>>> place_orders(infilepath_or_buffer=orders_file)

After receiving 23.50 per share in a tender offer for a position, record the execution in the blotter in order to mark the position as closed:

>>> from quantrocket.blotter import record_executions
>>> executions_file = io.StringIO()
>>> close_positions(executions_file, sids="FIBBG123456", params={"Price": 23.50})
>>> record_executions(infilepath_or_buffer=executions_file)
quantrocket.blotter.download_executions(filepath_or_buffer=None, order_refs=None, accounts=None, sids=None, start_date=None, end_date=None)

Query executions from the executions database.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • order_refs (list of str, optional) – limit to these order refs

  • accounts (list of str, optional) – limit to these accounts

  • sids (list of str, optional) – limit to these sids

  • start_date (str (YYYY-MM-DD), optional) – limit to executions on or after this date

  • end_date (str (YYYY-MM-DD), optional) – limit to executions on or before this date

Return type:

None

Notes

Usage Guide:

quantrocket.blotter.record_executions(executions=None, infilepath_or_buffer=None)

Record executions that happened outside of QuantRocket’s knowledge.

This endpoint does not interact with the broker but simply adds one or more executions to the blotter database and updates the blotter’s record of current positions accordingly. It can be used to bring the blotter in line with the broker when they differ. For example, when a position is liquidated because of a tender offer or merger/acquisition, you can use this endpoint to record the price received for your shares.

Parameters:
  • executions (list of dict of PARAM:VALUE, optional) –

    a list of one or more executions, where each execution is a dict specifying the execution parameters. The required params are:

    • Account

    • Action (“BUY” or “SELL”)

    • OrderRef

    • Price

    • Sid

    • TotalQuantity

    Optional params (rarely needed):

    • Commission (default is 0)

    • OrderId (default is an auto-generated ID)

    • Time (the time of execution, default is now)

  • infilepath_or_buffer (str or file-like object, optional) – record executions from this CSV or JSON file (specify ‘-’ to read file from stdin). Mutually exclusive with executions argument.

Returns:

a list of execution IDs generated by the blotter and inserted in the database

Return type:

list

See also

close_positions

generate orders to close positions, or generate executions to mark positions as closed

Notes

Usage Guide:

Examples

>>> executions = []
>>> execution1 = {
        'Sid':'FIBBG123456',
        'Action':'BUY',
        'TotalQuantity':100,
        'Account':'DU12345',
        'OrderRef':'my-strategy',
        'Price': 23.50
    }
>>> executions.append(execution1)
>>> execution_ids = record_executions(executions)
quantrocket.blotter.apply_split(sid, old_shares, new_shares)

Apply a stock split to an open position.

This endpoint does not interact with the broker but simply applies the split in the blotter database to bring the blotter in line with the broker. The split is also applied to the executions that created the open position, so that PNL calculations will be accurate.

The old_shares and new_shares parameters can be specified either using the published split ratio (for example, 2-for-1) or the actual number of pre- and post-split shares in your account.

Parameters:
  • sid (str, required) – the sid that underwent a split. There must currently be an open position in this security.

  • old_shares (int, required) – the number of pre-split shares

  • new_shares (int, required) – the number of post-split shares

Returns:

the old and new position for this sid, by account and order ref

Return type:

list

Notes

Usage Guide:

Examples

Apply a 2-for-1 split:

>>> apply_split("FIBBG12345", old_shares=1, new_shares=2)

Apply a 1-for-10 reverse split:

>>> apply_split("FIBBG98765", old_shares=10, new_shares=1)
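Whichever form of the ratio is passed, the resulting adjustment is the same: position quantities scale by new/old while per-share prices scale by old/new, leaving the cost basis unchanged. A minimal sketch of that arithmetic (an illustration only, not the blotter's internal code):

```python
def split_position(quantity, price, old_shares, new_shares):
    # quantity scales by new/old; price scales by old/new,
    # so quantity * price (cost basis) is unchanged
    new_quantity = quantity * new_shares // old_shares
    new_price = price * old_shares / new_shares
    return new_quantity, new_price

# 2-for-1 split: 100 shares @ 50.00 becomes 200 shares @ 25.00
qty, px = split_position(100, 50.0, old_shares=1, new_shares=2)

# 1-for-10 reverse split: 1000 shares @ 2.50 becomes 100 shares @ 25.00
qty2, px2 = split_position(1000, 2.5, old_shares=10, new_shares=1)
```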
quantrocket.blotter.download_pnl(filepath_or_buffer=None, order_refs=None, accounts=None, sids=None, start_date=None, end_date=None, timezone=None, details=False, output='csv')

Query trading performance and return a CSV of results or PDF tearsheet.

Trading performance is broken down by account and order ref and optionally by sid.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • order_refs (list of str, optional) – limit to these order refs

  • accounts (list of str, optional) – limit to these accounts

  • sids (list of str, optional) – limit to these sids

  • start_date (str (YYYY-MM-DD), optional) – limit to pnl on or after this date

  • end_date (str (YYYY-MM-DD), optional) – limit to pnl on or before this date

  • details (bool) – return detailed results for all securities instead of aggregating to account/order ref level (only supported for a single account and order ref at a time)

  • timezone (str, optional) – return execution times in this timezone (default UTC)

  • output (str, required) – the output format (choices are csv or pdf, default is csv)

Return type:

None

Notes

Usage Guide:

quantrocket.blotter.read_pnl_csv(filepath_or_buffer)

Load a PNL CSV into a DataFrame.

This is a light wrapper around pd.read_csv that handles setting index columns and casting to proper data types.

Parameters:

filepath_or_buffer (string or file-like, required) – path to CSV

Returns:

a multi-index (Field, Date[, Time]) DataFrame of backtest results, with sids or strategy codes as columns

Return type:

DataFrame

Notes

Usage Guide:

Examples

>>> results = read_pnl_csv("pnl.csv")
>>> returns = results.loc["Return"]

Blotter API

Resource Group

Orders

Place Orders
POST/blotter/orders

Place one or more orders from a CSV or JSON file. Returns a list of order IDs, which can be used to cancel the orders or check their status.

Example URI

POST http://houston/blotter/orders
Request
Headers
Content-Type: text/csv
Body
Sid,Account,Action,OrderRef,TotalQuantity,Exchange,OrderType,Tif
FI265598,DU123456,BUY,dma-tech,500,SMART,MKT,DAY
FI3691937,DU123456,BUY,dma-tech,50,SMART,MKT,DAY
Request
Headers
Content-Type: application/json
Body
[
  {
    "Sid": "FI265598",
    "Account": "DU123456",
    "Action": "BUY",
    "OrderRef": "dma-tech",
    "TotalQuantity": 500,
    "Exchange": "SMART",
    "OrderType": "MKT",
    "Tif": "DAY"
  },
  {
    "Sid": "FI3691937",
    "Account": "DU123456",
    "Action": "BUY",
    "OrderRef": "dma-tech",
    "TotalQuantity": 50,
    "Exchange": "SMART",
    "OrderType": "MKT",
    "Tif": "DAY"
  }
]
Response  200
Headers
Content-Type: application/json
Body
[
  "6001:25",
  "6001:26"
]
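A CSV request body like the one shown above can be assembled with the standard library before posting; a sketch using the column names from the example request:

```python
import csv
import io

orders = [
    {"Sid": "FI265598", "Account": "DU123456", "Action": "BUY",
     "OrderRef": "dma-tech", "TotalQuantity": 500, "Exchange": "SMART",
     "OrderType": "MKT", "Tif": "DAY"},
]

# write the orders to an in-memory CSV suitable for POST /blotter/orders
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(orders[0]))
writer.writeheader()
writer.writerows(orders)
orders_csv = buf.getvalue()  # body for Content-Type: text/csv
```

The same list of dicts serialized with `json.dumps(orders)` would serve as the application/json body.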

Get Order Status
GET/blotter/orders{output}{?order_ids,sids,order_refs,accounts,open_orders,start_date,end_date,fields}

Download order statuses.

Example URI

GET http://houston/blotter/orders.csv?order_ids=6001:25&sids=FI123456&order_refs=my-strategy&accounts=U12345&open_orders=true&start_date=2018-02-01&end_date=2018-03-01&fields=LmtPrice
URI Parameters
output
str (required) Example: .csv

output format

Choices: .csv .json

order_ids
str (optional) Example: 6001:25

limit to these order IDs (pass multiple times for multiple order IDs)

sids
str (optional) Example: FI123456

limit to orders for these sids (pass multiple times for multiple sids)

order_refs
str (optional) Example: my-strategy

limit to orders for these order refs (pass multiple times for multiple order refs)

accounts
str (optional) Example: U12345

limit to orders for these accounts (pass multiple times for multiple accounts)

open_orders
bool (optional) Example: true

limit to open orders

start_date
str (optional) Example: 2018-02-01

limit to orders submitted on or after this date

end_date
str (optional) Example: 2018-03-01

limit to orders submitted on or before this date

fields
str (optional) Example: LmtPrice

return these fields in addition to the default fields (pass ‘?’ or any invalid fieldname to see available fields) (pass multiple times for multiple fields)

Response  200
Headers
Content-Type: text/csv
Response  200
Headers
Content-Type: application/json
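Parameters documented as "pass multiple times" become repeated keys in the query string. A sketch of building such a URL with the standard library (the second sid is a hypothetical example):

```python
from urllib.parse import urlencode

# a sequence of (key, value) pairs allows repeated keys
params = [
    ("sids", "FI123456"),
    ("sids", "FI234567"),   # repeated key: "pass multiple times"
    ("open_orders", "true"),
]
query = urlencode(params)
url = "http://houston/blotter/orders.csv?" + query
```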

Cancel Orders
DELETE/blotter/orders{?order_ids,sids,order_refs,accounts,cancel_all}

Cancel one or more orders by order ID, sid, or order ref.

Example URI

DELETE http://houston/blotter/orders?order_ids=6001:25&sids=FI123456&order_refs=my-strategy&accounts=U12345&cancel_all=true
URI Parameters
order_ids
str (optional) Example: 6001:25

cancel these order IDs (pass multiple times for multiple order IDs)

sids
str (optional) Example: FI123456

cancel orders for these sids (pass multiple times for multiple sids)

order_refs
str (optional) Example: my-strategy

cancel orders for these order refs (pass multiple times for multiple order refs)

accounts
str (optional) Example: U12345

cancel orders for these accounts (pass multiple times for multiple accounts)

cancel_all
bool (optional) Example: true

cancel all open orders

Response  200
Headers
Content-Type: application/json
Body
{
    "order_ids": [
        "6001:25"
    ],
    "status": "the orders will be canceled asynchronously"
}

Positions

Get Positions
GET/blotter/positions{output}{?sids,order_refs,accounts,view,diff}

Query current positions and write results to file.

There are two ways to view positions: blotter view (default) and broker view.

The default “blotter view” returns positions by account, sid, and order ref. Positions are tracked based on execution records saved to the blotter database.

“Broker view” (view=‘broker’) returns positions by account and sid (but not order ref) as reported directly by the broker.

Example URI

GET http://houston/blotter/positions.csv?sids=FI123456&order_refs=my-strategy&accounts=U12345&view=blotter&diff=false
URI Parameters
output
str (required) Example: .csv

output format

Choices: .csv .json

view
str (optional) Example: blotter

whether to return ‘broker’ view of positions (by account and sid) or default ‘blotter’ view (by account, sid, and order ref)

Choices: blotter broker

sids
str (optional) Example: FI123456

limit to these sids (pass multiple times for multiple sids)

order_refs
str (optional) Example: my-strategy

limit to these order refs (not supported with broker view) (pass multiple times for multiple order refs)

accounts
str (optional) Example: U12345

limit to these accounts (pass multiple times for multiple accounts)

diff
bool (optional) Example: false

limit to positions where the blotter quantity and broker quantity disagree (requires broker view)

Response  200
Headers
Content-Type: text/csv
Response  200
Headers
Content-Type: application/json

Close Positions
DELETE/blotter/positions{output}{?sids,order_refs,accounts,params}

Generate orders to close positions. Doesn’t actually place any orders but returns an orders file that can be placed separately. Additional order parameters can be appended with the params argument.

This endpoint can also be used to generate executions for marking a position as closed due to a tender offer, merger/acquisition, etc.

Example URI

DELETE http://houston/blotter/positions.csv?sids=FI123456&order_refs=my-strategy&accounts=U12345&params=OrderType:MKT
URI Parameters
output
str (required) Example: .csv

output format

Choices: .csv .json

sids
str (optional) Example: FI123456

limit to these sids (pass multiple times for multiple sids)

order_refs
str (optional) Example: my-strategy

limit to these order refs (pass multiple times for multiple order refs)

accounts
str (optional) Example: U12345

limit to these accounts (pass multiple times for multiple accounts)

params
str (optional) Example: OrderType:MKT

additional parameters to append to each row in output (pass as ‘param:value’) (pass multiple times for multiple params)

Response  200
Headers
Content-Type: text/csv
Response  200
Headers
Content-Type: application/json

Split Position
PATCH/blotter/positions{?sid,old_shares,new_shares}

Apply a stock split to an open position.

This endpoint does not interact with the broker but simply applies the split in the blotter database to bring the blotter in line with the broker. The split is also applied to the executions that created the open position, so that PNL calculations will be accurate.

The old_shares and new_shares parameters can be specified either using the published split ratio (for example, 2-for-1) or the actual number of pre- and post-split shares in your account.

Example URI

PATCH http://houston/blotter/positions?sid=FI123456&old_shares=1&new_shares=2
URI Parameters
sid
str (required) Example: FI123456

the sid that underwent a split. There must currently be an open position in this security.

old_shares
int (required) Example: 1

the number of pre-split shares

new_shares
int (required) Example: 2

the number of post-split shares

Response  200
Headers
Content-Type: application/json

Executions

Get Executions
GET/blotter/executions{output}{?sids,order_refs,accounts,start_date,end_date}

Query executions from the executions database.

Example URI

GET http://houston/blotter/executions.csv?sids=FI123456&order_refs=my-strategy&accounts=U12345&start_date=2018-02-01&end_date=2018-03-01
URI Parameters
output
str (required) Example: .csv

output format

Choices: .csv

sids
str (optional) Example: FI123456

limit to these sids (pass multiple times for multiple sids)

order_refs
str (optional) Example: my-strategy

limit to these order refs (pass multiple times for multiple order refs)

accounts
str (optional) Example: U12345

limit to these accounts (pass multiple times for multiple accounts)

start_date
str (optional) Example: 2018-02-01

limit to executions on or after this date

end_date
str (optional) Example: 2018-03-01

limit to executions on or before this date

Response  200
Headers
Content-Type: text/csv

Record Executions
POST/blotter/executions

Record executions that happened outside of QuantRocket’s knowledge.

This endpoint does not interact with the broker but simply adds one or more executions to the blotter database and updates the blotter’s record of current positions accordingly. It can be used to bring the blotter in line with the broker when they differ. For example, when a position is liquidated because of a tender offer or merger/acquisition, you can use this endpoint to record the price received for your shares.

Returns a list of execution IDs inserted into the database.

Example URI

POST http://houston/blotter/executions
Request
Headers
Content-Type: text/csv
Body
Sid,Account,Action,OrderRef,TotalQuantity,Price
FI265598,DU123456,BUY,dma-tech,500,23.50
Request
Headers
Content-Type: application/json
Body
[
  {
    "Sid": "FI265598",
    "Account": "DU123456",
    "Action": "BUY",
    "OrderRef": "dma-tech",
    "TotalQuantity": 500,
    "Price": 23.5
  }
]
Response  200
Headers
Content-Type: application/json
Body
[
  "QR-74deb342-5bef-4b50-b235-110013f81d4d"
]

PNL

Get PNL
GET/blotter/pnl.{output}{?sids,order_refs,accounts,start_date,end_date,timezone,details}

Query trading performance and return a CSV of results or PDF tearsheet. Trading performance is broken down by account and order ref and optionally by sid.

Example URI

GET http://houston/blotter/pnl.csv?sids=FI123456&order_refs=my-strategy&accounts=U12345&start_date=2018-02-01&end_date=2018-03-01&timezone=America/New_York&details=true
URI Parameters
output
str (required) Example: csv

output format

Choices: csv pdf

sids
str (optional) Example: FI123456

limit to these sids (pass multiple times for multiple sids)

order_refs
str (optional) Example: my-strategy

limit to these order refs (pass multiple times for multiple order refs)

accounts
str (optional) Example: U12345

limit to these accounts (pass multiple times for multiple accounts)

start_date
str (optional) Example: 2018-02-01

limit to pnl on or after this date

end_date
str (optional) Example: 2018-03-01

limit to pnl on or before this date

timezone
str (optional) Example: America/New_York

return execution times in this timezone (default UTC)

details
bool (optional) Example: true

return detailed results for all securities instead of aggregating to account/order ref level (only supported for a single account and order ref at a time)

Response  200
Headers
Content-Type: text/csv
Response  200
Headers
Content-Type: application/pdf

quantrocket.codeload

code management service

QuantRocket code management CLI

usage: quantrocket codeload [-h] {clone} ...

subcommands

subcommand

Possible choices: clone

Sub-commands

clone

clone files from a Git repository

quantrocket codeload clone [-h] [-b BRANCH] [-r | -s] [-d TARGET_DIR] repo

Positional Arguments

repo

the name or URL of the repo. Can be the name of a QuantRocket demo repo (e.g. ‘umd’), a GitHub username/repo (e.g. ‘myuser/myrepo’), or the URL of any Git repository

Named Arguments

-b, --branch

the branch to clone (default ‘master’)

-r, --replace

if a file already exists locally, replace it with the remote file (mutually exclusive with --skip-existing)

Default: False

-s, --skip-existing

if a file already exists locally, skip it (mutually exclusive with --replace)

Default: False

-d, --target-dir

the directory into which files should be cloned. Default is ‘/codeload’

Clone files from a Git repository.

Only the files are copied, not the Git metadata. Can be run multiple times to clone files from multiple repositories. Won’t overwrite any existing files unless the --replace option is used.

Notes

Usage Guide:

Examples

Clone QuantRocket’s “umd” demo repository:

quantrocket codeload clone umd

Clone a GitHub repo and skip files that already exist locally:

quantrocket codeload clone myuser/myrepo --skip-existing

Clone a Bitbucket repo:

quantrocket codeload clone https://bitbucket.org/myuser/myrepo.git

Clone a private repo by including username and app password (Bitbucket) or personal access token (GitHub) in the URL:

quantrocket codeload clone https://myuser:myapppassword@bitbucket.org/myuser/myrepo.git
quantrocket.codeload.clone(repo, branch=None, replace=None, skip_existing=None, target_dir=None)

Clone files from a Git repository.

Only the files are copied, not the Git metadata. Can be run multiple times to clone files from multiple repositories. Won’t overwrite any existing files unless replace=True.

Parameters:
  • repo (str, required) – the name or URL of the repo. Can be the name of a QuantRocket demo repo (e.g. ‘umd’), a GitHub username/repo (e.g. ‘myuser/myrepo’), or the URL of any Git repository

  • branch (str, optional) – the branch to clone (default ‘master’)

  • replace (bool, optional) – if a file already exists locally, replace it with the remote file (mutually exclusive with skip_existing)

  • skip_existing (bool, optional) – if a file already exists locally, skip it (mutually exclusive with replace)

  • target_dir (str, optional) – the directory into which files should be cloned. Default is ‘/codeload’

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Clone QuantRocket’s “umd” demo repository:

>>> clone("umd")

Clone a GitHub repo and skip files that already exist locally:

>>> clone("myuser/myrepo", skip_existing=True)

Clone a Bitbucket repo:

>>> clone("https://bitbucket.org/myuser/myrepo.git")

Clone a private repo by including username and app password (Bitbucket) or personal access token (GitHub) in the URL:

>>> clone("https://myuser:myapppassword@bitbucket.org/myuser/myrepo.git")

Codeload API

Resource Group

Repo

Clone Git Repo
POST/repo{?repo,branch,replace,skip_existing}

Clone files from a Git repository.

Only the files are copied, not the Git metadata. Can be run multiple times to clone files from multiple repositories. Won’t overwrite any existing files unless replace=True.

Example URI

POST http://houston/repo?repo=umd&branch=master&replace=true&skip_existing=false
URI Parameters
repo
str (required) Example: umd

the name or URL of the repo. Can be the name of a QuantRocket demo repo (e.g. ‘umd’), a GitHub username/repo (e.g. ‘myuser/myrepo’), or the URL of any Git repository

branch
str (optional) Example: master

the branch to clone (default ‘master’)

replace
bool (optional) Example: true

if a file already exists locally, replace it with the remote file (mutually exclusive with skip_existing)

skip_existing
bool (optional) Example: false

if a file already exists locally, skip it (mutually exclusive with replace)

Response  200
Headers
Content-Type: application/json

quantrocket.countdown

cron scheduler service

QuantRocket cron service CLI

usage: quantrocket countdown [-h] {crontab,timezone} ...

subcommands

subcommand

Possible choices: crontab, timezone

Sub-commands

crontab

upload a new crontab, or return the current crontab

quantrocket countdown crontab [-h] [-s SERVICE_NAME] [FILENAME]

Positional Arguments

FILENAME

the crontab file to upload (if omitted, return the current crontab)

Named Arguments

-s, --service

the name of the countdown service (default ‘countdown’)

Upload a new crontab, or return the current crontab.

Notes

Usage Guide:

Examples

Upload a new crontab to a service called countdown-australia (replaces current crontab):

quantrocket countdown crontab mycron.crontab -s countdown-australia

Show current crontab for a service called countdown-australia:

quantrocket countdown crontab -s countdown-australia

timezone

set or show the countdown service timezone

quantrocket countdown timezone [-h] [-s SERVICE_NAME] [TZ]

Positional Arguments

TZ

the timezone to set (pass a partial timezone string such as ‘newyork’ or ‘europe’ to see close matches, or pass ‘?’ to see all choices)

Named Arguments

-s, --service

the name of the countdown service (default ‘countdown’)

Set or show the countdown service timezone.

Notes

Usage Guide:

Examples

Set the timezone of the countdown service to America/New_York:

quantrocket countdown timezone America/New_York

Show the current timezone of the countdown service:

quantrocket countdown timezone

Show the timezone for a service called countdown-australia:

quantrocket countdown timezone -s countdown-australia
quantrocket.countdown.get_timezone(service=None)

Return the countdown service timezone.

Parameters:

service (str, optional) – the name of the countdown service (default ‘countdown’)

Returns:

dict with key timezone

Return type:

dict

Notes

Usage Guide:

quantrocket.countdown.set_timezone(tz, service=None)

Set the countdown service timezone.

Parameters:
  • tz (str, required) – the timezone to set (pass a partial timezone string such as ‘newyork’ or ‘europe’ to see close matches, or pass ‘?’ to see all choices)

  • service (str, optional) – the name of the countdown service (default ‘countdown’)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Set the countdown timezone to America/New_York:

>>> set_timezone("America/New_York")
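The service expects an IANA timezone name such as America/New_York. One way to check a name locally before setting it (a sketch assuming Python 3.9+ with timezone data installed):

```python
from zoneinfo import available_timezones

tz = "America/New_York"
# raise early rather than send an invalid name to the service
if tz not in available_timezones():
    raise ValueError(f"unknown timezone: {tz}")
```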

Countdown API

Resource Group

Crontab

Get Crontab
GET/{service}/crontab

Returns the service crontab.

Example URI

GET http://houston/countdown-usa/crontab
URI Parameters
service
str (required) Example: countdown-usa

The name of the countdown service, e.g. countdown-usa

Response  200
Headers
Content-Type: text/plain
Body
30 9 1-31 1-12 1-5 curl -X POST https://houston/api/endpoint1
30 17 1-31 1-12 1-5 curl -X POST https://houston/api/endpoint2

Load Crontab
PUT/{service}/crontab

Uploads a new crontab.

Example URI

PUT http://houston/countdown-usa/crontab
URI Parameters
service
str (required) Example: countdown-usa

The name of the countdown service, e.g. countdown-usa

Request
Headers
Content-Type: text/plain
Body
30 9 1-31 1-12 1-5 curl -X POST https://houston/api/endpoint1
30 17 1-31 1-12 1-5 curl -X POST https://houston/api/endpoint2
0 12 1-31 1-12 1-7 curl -X POST https://houston/api/endpoint3
Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the crontab will be loaded asynchronously"
}
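Each line in the crontab body follows the standard five-field cron schedule (minute, hour, day of month, month, day of week) followed by the command to run; a sketch splitting one of the lines above into its parts:

```python
line = "30 9 1-31 1-12 1-5 curl -X POST https://houston/api/endpoint1"

# the first five whitespace-separated fields are the schedule;
# everything after them is the command
minute, hour, dom, month, dow, command = line.split(maxsplit=5)
```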

Timezone

Get Timezone
GET/{service}/timezone

Returns the service timezone.

Example URI

GET http://houston/countdown-usa/timezone
URI Parameters
service
str (required) Example: countdown-usa

The name of the countdown service, e.g. countdown-usa

Response  200
Headers
Content-Type: application/json
Body
{
  "timezone": "America/New_York"
}

Set Timezone
PUT/{service}/timezone{?tz}

Sets the service timezone.

Example URI

PUT http://houston/countdown-usa/timezone?tz=America/New_York
URI Parameters
service
str (required) Example: countdown-usa

The name of the countdown service, e.g. countdown-usa

tz
str (required) Example: America/New_York

The timezone to set.

Response  200
Headers
Content-Type: application/json

quantrocket.db

database management service

QuantRocket database service CLI

usage: quantrocket db [-h] {list,s3config,s3push,s3pull,optimize} ...

subcommands

subcommand

Possible choices: list, s3config, s3push, s3pull, optimize

Sub-commands

list

list databases

quantrocket db list [-h] [-s [SERVICE ...]] [-c [DATABASE_CODE ...]] [-d] [-e]

Named Arguments

-s, --services

limit to these services

-c, --codes

limit to these codes

-d, --detail

return database statistics (default is to return a flat list of database names)

Default: False

-e, --expand

expand sharded databases to include individual shards (default is to list sharded databases as a single database)

Default: False

List databases.

Notes

Usage Guide:

Examples

List all databases:

quantrocket db list

List all history databases and include details such as file size:

quantrocket db list --services history --detail

List details for a sharded history database called usa-stk-15min and list each shard individually:

quantrocket db list --services history --codes usa-stk-15min --detail --expand

s3config

set or show Amazon S3 configuration for pushing and pulling databases to and from S3

quantrocket db s3config [-h] [-a ACCESS_KEY_ID] [-s SECRET_ACCESS_KEY]
                        [-b BUCKET] [-r REGION]

Named Arguments

-a, --access-key-id

AWS access key ID

-s, --secret-access-key

AWS secret access key (if omitted and access-key-id is provided, will be prompted for secret-access-key)

-b, --bucket

the S3 bucket name to push to/pull from

-r, --region

the AWS region in which to create the bucket (default us-east-1). Ignored if the bucket already exists.

Set or show Amazon S3 configuration for pushing and pulling databases to and from S3.

Credentials are encrypted at rest and never leave your deployment.

Notes

Usage Guide:

Examples

Configure S3 (will prompt for secret access key):

quantrocket db s3config --access-key-id XXXXXXXX --bucket my-bucket

Preserve existing credentials but point to a new bucket:

quantrocket db s3config --bucket my-other-bucket

Show current configuration:

quantrocket db s3config

s3push

push database(s) to Amazon S3

quantrocket db s3push [-h] [-s [SERVICE ...]] [-c [DATABASE_CODE ...]]

Named Arguments

-s, --services

limit to these services

-c, --codes

limit to these codes

Push database(s) to Amazon S3.

Notes

Usage Guide:

Examples

Push all databases:

quantrocket db s3push

Push all databases for the history service:

quantrocket db s3push --services history

Push a database called quantrocket.history.nyse.sqlite:

quantrocket db s3push --services history --codes nyse

s3pull

pull database(s) from Amazon S3

quantrocket db s3pull [-h] [-s [SERVICE ...]] [-c [DATABASE_CODE ...]] [-f]

Named Arguments

-s, --services

limit to these services

-c, --codes

limit to these codes

-f, --force

overwrite existing database if one exists (default is to fail if one exists)

Default: False

Pull database(s) from Amazon S3 to the db service.

Notes

Usage Guide:

Examples

Pull a database stored on S3 as quantrocket.history.nyse.sqlite.gz:

quantrocket db s3pull --services history --codes nyse

optimize

optimize databases to improve performance

quantrocket db optimize [-h] [-s [SERVICE ...]] [-c [DATABASE_CODE ...]]

Named Arguments

-s, --services

limit to these services

-c, --codes

limit to these codes

Optimize databases to improve performance.

This runs the ‘VACUUM’ command, which defragments the database and reclaims disk space.

Notes

Usage Guide:

Examples

Optimize all blotter databases:

quantrocket db optimize --services blotter
quantrocket.db.list_databases(services=None, codes=None, detail=False, expand=False)

List databases.

Parameters:
  • services (list of str, optional) – limit to these services

  • codes (list of str, optional) – limit to these codes

  • detail (bool) – return database statistics (default is to return a flat list of database names)

  • expand (bool) – expand sharded databases to include individual shards (default is to list sharded databases as a single database)

Returns:

dict of lists of databases (one key for PostgreSQL databases and one for SQLite databases)

Return type:

dict

Notes

Usage Guide:

Examples

Load database details in a pandas DataFrame:

>>> import pandas as pd
>>> import itertools
>>> from quantrocket.db import list_databases
>>> databases = list_databases(detail=True)
>>> databases = pd.DataFrame.from_records(itertools.chain(databases["sqlite"], databases["postgres"]))
quantrocket.db.get_s3_config()

Return the current S3 configuration, if any.

Returns:

configuration details

Return type:

dict

Notes

Usage Guide:

quantrocket.db.set_s3_config(access_key_id=None, secret_access_key=None, bucket=None, region=None)

Set AWS S3 configuration for pushing and pulling databases to and from S3.

Credentials are encrypted at rest and never leave your deployment.

Parameters:
  • access_key_id (str, optional) – AWS access key ID

  • secret_access_key (str, optional) – AWS secret access key (if omitted and access_key_id is provided, will be prompted for secret_access_key)

  • bucket (str, optional) – the S3 bucket name to push to/pull from

  • region (str, optional) – the AWS region in which to create the bucket (default us-east-1). Ignored if the bucket already exists.

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.db.s3_push_databases(services=None, codes=None)

Push database(s) to Amazon S3.

Parameters:
  • services (list of str, optional) – limit to these services

  • codes (list of str, optional) – limit to these codes

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.db.s3_pull_databases(services=None, codes=None, force=False)

Pull database(s) from Amazon S3.

Parameters:
  • services (list of str, optional) – limit to these services

  • codes (list of str, optional) – limit to these codes

  • force (bool) – overwrite existing database if one exists (default is to fail if one exists)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.db.optimize_databases(services=None, codes=None)

Optimize databases to improve performance.

This runs the ‘VACUUM’ command, which defragments the database and reclaims disk space.

Parameters:
  • services (list of str, optional) – limit to these services

  • codes (list of str, optional) – limit to these codes

Returns:

status message

Return type:

dict

Notes

Usage Guide:
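
Examples

Optimize all databases belonging to the history service:

>>> from quantrocket.db import optimize_databases
>>> optimize_databases(services=["history"])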

quantrocket.db.connect_sqlite(db_path)

Return a connection to a SQLite database.

Parameters:

db_path (str, required) – full path to a SQLite database

Returns:

database connection

Return type:

sqlalchemy.engine.Engine

Notes

Usage Guide:
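
Examples

Connect to a SQLite database (the path shown is hypothetical; use list_databases to find the database files on your deployment):

>>> from quantrocket.db import connect_sqlite
>>> conn = connect_sqlite("/var/lib/quantrocket/quantrocket.history.canada.sqlite")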

quantrocket.db.insert_or_fail(df, table_name, conn)

Insert a DataFrame into a SQLite database.

In the case of a duplicate record insertion, the function will fail.

Parameters:
  • df (DataFrame, required) – the DataFrame to insert. All DataFrame columns must exist in the destination table. The DataFrame index will not be inserted.

  • table_name (str, required) – the name of the table to insert the DataFrame into. The table must already exist in the database.

  • conn (sqlalchemy.engine.Engine, required) – a connection object for the SQLite database

Return type:

None

Raises:

quantrocket.exceptions.DataInsertionError – catch-all exception class for errors that occur when writing to the SQLite database

Notes

Usage Guide:

quantrocket.db.insert_or_replace(df, table_name, conn)

Insert a DataFrame into a SQLite database.

In the case of a duplicate record insertion, the incoming record will replace the existing record.

Parameters:
  • df (DataFrame, required) – the DataFrame to insert. All DataFrame columns must exist in the destination table. The DataFrame index will not be inserted.

  • table_name (str, required) – the name of the table to insert the DataFrame into. The table must already exist in the database.

  • conn (sqlalchemy.engine.Engine, required) – a connection object for the SQLite database

Return type:

None

Raises:

quantrocket.exceptions.DataInsertionError – catch-all exception class for errors that occur when writing to the SQLite database

Notes

Usage Guide:
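
Examples

Insert a DataFrame into a hypothetical 'Price' table, replacing any duplicate records (the table name, database path, and DataFrame contents are illustrative):

>>> import pandas as pd
>>> from quantrocket.db import connect_sqlite, insert_or_replace
>>> conn = connect_sqlite("/var/lib/quantrocket/quantrocket.history.canada.sqlite")
>>> prices = pd.DataFrame({"Sid": ["FIBBG000B9XRY4"], "Close": [185.0]})
>>> insert_or_replace(prices, "Price", conn)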

quantrocket.db.insert_or_ignore(df, table_name, conn)

Insert a DataFrame into a SQLite database.

In the case of a duplicate record insertion, the incoming record will be ignored.

Parameters:
  • df (DataFrame, required) – the DataFrame to insert. All DataFrame columns must exist in the destination table. The DataFrame index will not be inserted.

  • table_name (str, required) – the name of the table to insert the DataFrame into. The table must already exist in the database.

  • conn (sqlalchemy.engine.Engine, required) – a connection object for the SQLite database

Return type:

None

Raises:

quantrocket.exceptions.DataInsertionError – catch-all exception class for errors that occur when writing to the SQLite database

Notes

Usage Guide:

 

DB API

Resource Group

Database List

List Databases
GET/db/databases{?services,codes,detail,expand}

List databases.

Example URI

GET http://houston/db/databases?services=history&codes=usa-stk-1min&detail=true&expand=true
URI Parameters
services
str (optional) Example: history

limit to these services (pass multiple times for multiple services)

codes
str (optional) Example: usa-stk-1min

limit to these codes (omit to list all databases for service) (pass multiple times for multiple codes)

detail
bool (optional) Example: true

return database statistics (default is to return a flat list of database names). Currently only supported for SQLite databases.

expand
bool (optional) Example: true

expand sharded databases to include individual shards (default is to list sharded databases as a single database)

Response  200
Headers
Content-Type: application/json
Body
{
    "sqlite": [
        "quantrocket.history.canada.sqlite",
        "quantrocket.history.australia_eod.sqlite"
    ],
    "postgres": []
}

S3 Config

Set S3 configuration
PUT/s3config{?access_key_id,secret_access_key,bucket,region}

Set AWS S3 configuration for pushing and pulling databases to and from S3.

See http://qrok.it/h/dbs3 to learn more.

Credentials are encrypted at rest and never leave your deployment.

Example URI

PUT http://houston/s3config?access_key_id=XXXXXXXXX&secret_access_key=XXXXXXXXX&bucket=mybucket&region=us-east-1
URI Parameters
access_key_id
str (optional) Example: XXXXXXXXX

AWS access key ID

secret_access_key
str (optional) Example: XXXXXXXXX

AWS secret access key

bucket
str (optional) Example: mybucket

the S3 bucket name to push to/pull from

region
str (optional) Example: us-east-1

the AWS region in which to create the bucket (default us-east-1). Ignored if the bucket already exists.

Response  200
Headers
Content-Type: application/json

Get S3 configuration
GET/s3config

Return the current S3 configuration, if any.

Example URI

GET http://houston/s3config
Response  200
Headers
Content-Type: application/json
Body
{
    "AWS_ACCESS_KEY_ID": "XXXXXXXXX",
    "S3_BUCKET": "mybucket"
}

S3

Pull from S3
GET/db/s3{?services,codes,force}

Pull database(s) from Amazon S3.

Example URI

GET http://houston/db/s3?services=history&codes=canada&force=true
URI Parameters
services
str (required) Example: history

limit to these services (pass multiple times for multiple services)

codes
str (optional) Example: canada

limit to these codes (pass multiple times for multiple codes)

force
bool (optional) Example: true

overwrite existing database if one exists (default is to fail if one exists)

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the databases will be pulled from S3 asynchronously"
}

Push to S3
PUT/db/s3{?services,codes}

Push database(s) to Amazon S3.

Example URI

PUT http://houston/db/s3?services=history&codes=canada
URI Parameters
services
str (required) Example: history

limit to these services (pass multiple times for multiple services)

codes
str (optional) Example: canada

limit to these codes (pass multiple times for multiple codes)

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the databases will be pushed to S3 asynchronously"
}

Optimizations

Optimize Databases
POST/db/optimizations{?services,codes}

Optimize database file(s) to improve performance.

Example URI

POST http://houston/db/optimizations?services=history&codes=canada
URI Parameters
services
str (required) Example: history

limit to these services (pass multiple times for multiple services)

codes
str (optional) Example: canada

limit to these codes (pass multiple times for multiple codes)

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the databases will be optimized asynchronously"
}

quantrocket.flightlog

logging service

QuantRocket logging service CLI

usage: quantrocket flightlog [-h]
                             {stream,get,wait,log,timezone,papertrail} ...

subcommands

subcommand

Possible choices: stream, get, wait, log, timezone, papertrail

Sub-commands

stream

stream application logs, tail -f style

quantrocket flightlog stream [-h] [-d] [--hist NUM_LINES] [--nocolor]

Named Arguments

-d, --detail

show detailed logs from logspout, otherwise show log messages from flightlog only

Default: False

--hist

number of log lines to show right away (ignored if showing detailed logs)

--nocolor

don’t colorize the logs

Default: True

Stream application logs, tail -f style.

Notes

Usage Guide:

Examples

Stream application logs:

quantrocket flightlog stream

Stream detailed logs:

quantrocket flightlog stream --detail

get

download the logfile

quantrocket flightlog get [-h] [-d] [-m PATTERN] OUTFILE

Positional Arguments

OUTFILE

filename to write the logfile to

Named Arguments

-d, --detail

download detailed logs from the logspout service, otherwise download the standard logs from the flightlog service

Default: False

-m, --match

filter the logfile to lines containing this string

Download the logfile.

Notes

Usage Guide:

Examples

Download application logs:

quantrocket flightlog get app.log

Download detailed logs:

quantrocket flightlog get --detail sys.log

Download detailed logs for the history service:

quantrocket flightlog get --detail --match quantrocket_history sys.log

wait

wait for a message to appear in the logs

quantrocket flightlog wait [-h] [-r] [-d] [--tail TAIL] [--timeout TIMEOUT]
                           message

Positional Arguments

message

the log message to search for

Named Arguments

-r, --regex

if True, treat the message argument as a regular expression (default is to treat it as a plain string)

Default: False

-d, --detail

if True, search the detailed logs from the logspout service (default is to search the standard logs from the flightlog service)

Default: False

--tail

search the most recent N lines of the logs in addition to searching future logs (default is to only search future logs)

--timeout

fail if the message is not found after this much time (use Pandas timedelta string, e.g. 30sec or 5min or 2h; default is to wait indefinitely)

Wait for a message to appear in the logs.

Searches can be performed against the standard or detailed log file. When searching the detailed logs, note that the log file uses the syslog format, which differs from the format used when streaming detailed logs. Download the detailed log file to see the exact format your search will run against.

Notes

Usage Guide:

Examples

Wait up to 10 minutes for a message to appear indicating that data ingestion has finished:

quantrocket flightlog wait '[usstock-1min] Completed ingesting data' --timeout 10m

Using a regular expression, wait up to 1 hour for a message to appear indicating that data collection has finished:

quantrocket flightlog wait '\[usstock-1d\] Collected [0-9]+ monthly files' --regex --timeout 1h

log

log a message

quantrocket flightlog log [-h] [-l LEVEL] [-n LOGGER_NAME] [msg]

Positional Arguments

msg

the message to be logged

Default: “-”

Named Arguments

-l, --level

Possible choices: DEBUG, INFO, WARNING, ERROR, CRITICAL

the log level for the message. Possible choices: (‘DEBUG’, ‘INFO’, ‘WARNING’, ‘ERROR’, ‘CRITICAL’)

Default: “INFO”

-n, --name

the logger name

Default: “quantrocket._cli”

Log a message.

Notes

Usage Guide:

Examples

Log a message under the name “myapp”:

quantrocket flightlog log "this is a test" --name myapp --level INFO

Log the output from another command:

quantrocket account balance --below Cushion:0.02 | quantrocket flightlog log --name quantrocket.account --level CRITICAL

timezone

set or show the flightlog timezone

quantrocket flightlog timezone [-h] [TZ]

Positional Arguments

TZ

the timezone to set (pass a partial timezone string such as ‘newyork’ or ‘europe’ to see close matches, or pass ‘?’ to see all choices)

Set or show the flightlog timezone.

Notes

Usage Guide:

Examples

Set the flightlog timezone to America/New_York:

quantrocket flightlog timezone America/New_York

Show the current flightlog timezone:

quantrocket flightlog timezone

papertrail

set or show the Papertrail log configuration

quantrocket flightlog papertrail [-h] [--host HOST] [--port PORT]

Named Arguments

--host

the Papertrail host to log to

--port

the Papertrail port to log to

Set or show the Papertrail log configuration.

Notes

Usage Guide:

Examples

Set the Papertrail host and port to log to:

quantrocket flightlog papertrail --host logs.papertrailapp.com --port 55555

Show the current papertrail config:

quantrocket flightlog papertrail
quantrocket.flightlog.FlightlogHandler(background=None)

Return a log handler that logs to flightlog.

Parameters:

background (bool) – If True, causes logging to happen in a background thread so that logging doesn’t block. Background logging requires Python 3.2 or higher, and defaults to True for supported versions and False otherwise.

Return type:

logging.handlers.QueueHandler or quantrocket.flightlog._ImpatientHttpHandler

Notes

Usage Guide:

Examples

Log a message using the FlightlogHandler:

>>> import logging
>>> from quantrocket.flightlog import FlightlogHandler
>>> logger = logging.getLogger('myapp')
>>> logger.setLevel(logging.DEBUG)
>>> handler = FlightlogHandler()
>>> logger.addHandler(handler)
>>> logger.info('my app just opened a position')
quantrocket.flightlog.stream_logs(detail=False, hist=None, color=True)

Stream application logs, tail -f style.

Parameters:
  • detail (bool) – if True, show detailed logs from logspout, otherwise show log messages from flightlog only (default False)

  • hist (int, optional) – number of log lines to show right away (ignored if showing detailed logs)

  • color (bool) – colorize the logs

Yields:

str – each log line as it arrives

Notes

Usage Guide:
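
Examples

Stream application logs, showing the 5 most recent lines first:

>>> from quantrocket.flightlog import stream_logs
>>> for line in stream_logs(hist=5):
...     print(line)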

quantrocket.flightlog.download_logfile(outfile, detail=False, match=None)

Download the logfile.

Parameters:
  • outfile (str or file-like object, required) – filename or file object to write the logfile to

  • detail (bool) – download detailed logs from the logspout service, otherwise download the standard logs from the flightlog service

  • match (str, optional) – filter the logfile to lines containing this string

Return type:

None

Notes

Usage Guide:
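
Examples

Download the application logs, filtered to lines containing “history”:

>>> from quantrocket.flightlog import download_logfile
>>> download_logfile("app.log", match="history")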

quantrocket.flightlog.wait_for_message(message, regex=False, detail=False, tail=0, timeout=None)

Wait for a message to appear in the logs.

Searches can be performed against the standard or detailed log file. When searching the detailed logs, note that the log file uses the syslog format, which differs from the format used when streaming detailed logs. Download the detailed log file to see the exact format your search will run against.

Parameters:
  • message (str, required) – the log message to search for

  • regex (bool) – if True, treat the message argument as a regular expression (default is to treat it as a plain string)

  • detail (bool, optional) – if True, search the detailed logs from the logspout service (default is to search the standard logs from the flightlog service)

  • tail (int, optional) – search the most recent N lines of the logs in addition to searching future logs (default is to only search future logs)

  • timeout (str, optional) – fail if the message is not found after this much time (use Pandas timedelta string, e.g. 30sec or 5min or 2h; default is to wait indefinitely)

Returns:

status dict containing the matching log line

Return type:

dict

Notes

Usage Guide:

Examples

Wait up to 10 minutes for a message to appear indicating that data ingestion has finished:

>>> wait_for_message('[usstock-1min] Completed ingesting data', timeout='10m')

Using a regular expression, wait up to 1 hour for a message to appear indicating that data collection has finished:

>>> wait_for_message(r'\[usstock-1d\] Collected [0-9]+ monthly files', regex=True, timeout='1h')
quantrocket.flightlog.get_timezone()

Return the flightlog timezone.

Returns:

dict with key timezone

Return type:

dict

Notes

Usage Guide:
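
Examples

Show the current flightlog timezone:

>>> from quantrocket.flightlog import get_timezone
>>> get_timezone()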

quantrocket.flightlog.set_timezone(tz)

Set the flightlog timezone.

Parameters:

tz (str, required) – the timezone to set (pass a partial timezone string such as ‘newyork’ or ‘europe’ to see close matches, or pass ‘?’ to see all choices)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Set the flightlog timezone to America/New_York:

>>> set_timezone("America/New_York")
quantrocket.flightlog.get_papertrail_config()

Return the current Papertrail log configuration, if any.

Returns:

config details

Return type:

dict

Notes

Usage Guide:
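
Examples

Return the current Papertrail configuration, if any:

>>> from quantrocket.flightlog import get_papertrail_config
>>> get_papertrail_config()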

quantrocket.flightlog.set_papertrail_config(host, port)

Set the Papertrail log configuration.

Parameters:
  • host (str, required) – the Papertrail host to log to

  • port (int, required) – the Papertrail port to log to

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Configure flightlog to log to Papertrail:

>>> set_papertrail_config("logs.papertrailapp.com", 55555)
 

Flightlog API

Resource Group

Logfiles

Download log files
GET/flightlog/logfile/{logtype}{?match}

Download the logfile.

Example URI

GET http://houston/flightlog/logfile/app?match=history
URI Parameters
logtype
str (required) Example: app

the type of logfile to download

Choices: app system

match
str (optional) Example: history

filter the logfile to lines containing this string

Response  200
Headers
Content-Type: text/plain
Body
2020-01-18 10:19:31 quantrocket.flightlog: INFO Detected a change in flightlog configs directory, reloading configs...
2020-01-18 10:19:31 quantrocket.flightlog: INFO Successfully loaded config
2020-01-18 14:25:57 quantrocket.master: INFO Requesting contract details for error 200 symbols

Log messages

Wait for message
GET/flightlog/messages/{message}{?regex,detail,tail,timeout}

Wait for a message to appear in the logs.

Searches can be performed against the standard or detailed log file. When searching the detailed logs, note that the log file uses the syslog format, which differs from the format used when streaming detailed logs. Download the detailed log file to see the exact format your search will run against.

Example URI

GET http://houston/flightlog/messages/[usstock-1min] Completed ingesting data?regex=False&detail=False&tail=10&timeout=2m
URI Parameters
message
str (required) Example: [usstock-1min] Completed ingesting data

the log message to search for

regex
bool (optional) Example: False

if True, treat the message argument as a regular expression (default is to treat it as a plain string)

detail
bool (optional) Example: False

if True, search the detailed logs from the logspout service (default is to search the standard logs from the flightlog service)

tail
int (optional) Example: 10

search the most recent N lines of the logs in addition to searching future logs (default is to only search future logs)

timeout
str (optional) Example: 2m

fail if the message is not found after this much time (use Pandas timedelta string, e.g. 30sec or 5min or 2h; default is to wait indefinitely)

Response  200
Headers
Content-Type: text/plain
Body
{
  "status": "success",
  "match": "2020-10-01 08:09:09 quantrocket.zipline: INFO [usstock-1min] Completed ingesting data for 8961 securities in usstock-1min bundle"
}

Log Stream

Stream logs
GET/flightlog/stream/logs{?hist,nocolor}

Stream application logs, tail -f style.

Example URI

GET http://houston/flightlog/stream/logs?hist=5&nocolor=
URI Parameters
hist
int (optional) Example: 5

number of log lines to show right away

nocolor
str (optional) 

don’t colorize the logs if nocolor parameter is passed (parameter value doesn’t matter)

Response  200
Headers
Content-Type: text/plain
Transfer-Encoding: chunked
Body
2020-01-18 10:19:31 quantrocket.flightlog: INFO Successfully loaded config
2020-01-18 14:25:57 quantrocket.master: INFO Requesting contract details for error 200 symbols

Logspout Stream

Stream logspout
GET/logspout/logs{?colors}

Stream detailed logs from logspout, tail -f style.

Example URI

GET http://houston/logspout/logs?colors=off
URI Parameters
colors
str (optional) Example: off

don’t colorize the logs if colors=off (colorized by default)

Response  200
Headers
Content-Type: text/plain
Transfer-Encoding: chunked
Body
quantrocket_houston_1|172.21.0.1 - - [18/Jan/2020:10:14:48 +0000] "POST /flightlog/handler HTTP/1.1" 200 5 "-" "-"
2020-01-18 10:19:31 quantrocket.flightlog: INFO Detected a change in flightlog configs directory, reloading configs...
2020-01-18 10:19:31 quantrocket.flightlog: INFO Successfully loaded config
2020-01-18 14:25:57 quantrocket.master: INFO Requesting contract details for error 200 symbols
test_houston_1|2020/01/18 20:59:01 [error] 5#5: *17137 open() "/usr/local/openresty/nginx/html/invalidpath" failed (2: No such file or directory), client: 172.20.0.8, server: localhost, request: "GET /invalidpath HTTP/1.1", host: "houston"

Timezone

Get Timezone
GET/timezone

Returns the flightlog timezone.

Example URI

GET http://houston/timezone
Response  200
Headers
Content-Type: application/json
Body
{
  "timezone": "America/New_York"
}

Set Timezone
PUT/timezone{?tz}

Sets the timezone.

Example URI

PUT http://houston/timezone?tz=America/New_York
URI Parameters
tz
str (required) Example: America/New_York

The timezone to set.

Response  200
Headers
Content-Type: application/json

Papertrail

Get Papertrail configuration
GET/papertrail

Return the current Papertrail log configuration, if any.

Example URI

GET http://houston/papertrail
Response  200
Headers
Content-Type: application/json
Body
{
    "PAPERTRAIL_HOST": "logs.papertrailapp.com",
    "PAPERTRAIL_PORT": 55555
}

Set Papertrail configuration
PUT/papertrail{?host,port}

Set the Papertrail log configuration.

Example URI

PUT http://houston/papertrail?host=logs.papertrailapp.com&port=55555
URI Parameters
host
str (required) Example: logs.papertrailapp.com

the Papertrail host to log to

port
int (required) Example: 55555

the Papertrail port to log to

Response  200
Headers
Content-Type: application/json

quantrocket.fundamental

fundamental data service

QuantRocket fundamental data CLI

usage: quantrocket fundamental [-h]
                               {collect-sharadar-fundamentals,collect-sharadar-insiders,collect-sharadar-institutions,collect-sharadar-sec8,collect-sharadar-sp500,collect-ibkr-shortshares,collect-ibkr-borrowfees,collect-ibkr-margin,collect-alpaca-etb,sharadar-fundamentals,sharadar-insiders,sharadar-institutions,sharadar-sec8,sharadar-sp500,ibkr-shortshares,ibkr-borrowfees,ibkr-margin,alpaca-etb,collect-wsh,reuters-financials,reuters-estimates,wsh}
                               ...

subcommands

subcommand

Possible choices: collect-sharadar-fundamentals, collect-sharadar-insiders, collect-sharadar-institutions, collect-sharadar-sec8, collect-sharadar-sp500, collect-ibkr-shortshares, collect-ibkr-borrowfees, collect-ibkr-margin, collect-alpaca-etb, sharadar-fundamentals, sharadar-insiders, sharadar-institutions, sharadar-sec8, sharadar-sp500, ibkr-shortshares, ibkr-borrowfees, ibkr-margin, alpaca-etb, collect-wsh, reuters-financials, reuters-estimates, wsh

Sub-commands

collect-sharadar-fundamentals

collect fundamental data from Sharadar and save to database

quantrocket fundamental collect-sharadar-fundamentals [-h] [-c COUNTRY]

Named Arguments

-c, --country

Possible choices: US, FREE

country to collect fundamentals for. Possible choices: [‘US’, ‘FREE’]

Default: “US”

Collect fundamental data from Sharadar and save to database.

Notes

Usage Guide:

Examples

quantrocket fundamental collect-sharadar-fundamentals

collect-sharadar-insiders

collect insider holdings data from Sharadar and save to database

quantrocket fundamental collect-sharadar-insiders [-h] [-c COUNTRY]

Named Arguments

-c, --country

Possible choices: US, FREE

country to collect insider holdings data for. Possible choices: [‘US’, ‘FREE’]

Default: “US”

Collect insider holdings data from Sharadar and save to database.

Notes

Usage Guide:

Examples

quantrocket fundamental collect-sharadar-insiders

collect-sharadar-institutions

collect institutional investor data from Sharadar and save to database

quantrocket fundamental collect-sharadar-institutions [-h] [-c COUNTRY] [-d]

Named Arguments

-c, --country

Possible choices: US, FREE

country to collect institutional investor data for. Possible choices: [‘US’, ‘FREE’]

Default: “US”

-d, --detail

collect detailed investor data (separate record per investor per security per quarter). If omitted, collect data aggregated by security (separate record per security per quarter)

Default: False

Collect institutional investor data from Sharadar and save to database.

Notes

Usage Guide:

Examples

Collect institutional investor data aggregated by security:

quantrocket fundamental collect-sharadar-institutions

Collect detailed institutional investor data (not aggregated by security):

quantrocket fundamental collect-sharadar-institutions -d

collect-sharadar-sec8

collect SEC Form 8-K events from Sharadar and save to database

quantrocket fundamental collect-sharadar-sec8 [-h] [-c COUNTRY]

Named Arguments

-c, --country

Possible choices: US, FREE

country to collect events data for. Possible choices: [‘US’, ‘FREE’]

Default: “US”

Collect SEC Form 8-K events from Sharadar and save to database.

Notes

Usage Guide:

Examples

quantrocket fundamental collect-sharadar-sec8

collect-sharadar-sp500

collect historical S&P 500 index constituents from Sharadar and save to database

quantrocket fundamental collect-sharadar-sp500 [-h] [-c COUNTRY]

Named Arguments

-c, --country

Possible choices: US, FREE

country to collect S&P 500 constituents data for. Possible choices: [‘US’, ‘FREE’]

Default: “US”

Collect historical S&P 500 index constituents from Sharadar and save to database.

Notes

Usage Guide:

Examples

quantrocket fundamental collect-sharadar-sp500

collect-ibkr-shortshares

collect Interactive Brokers shortable shares data and save to database

quantrocket fundamental collect-ibkr-shortshares [-h] [-c [COUNTRY ...]]

Named Arguments

-c, --countries

limit to these countries (pass ‘?’ or any invalid country to see available countries)

Collect Interactive Brokers shortable shares data and save to database.

Data is organized by country and updated every 15 minutes. Historical data is available from April 15, 2018. Detailed intraday data as well as aggregated daily data will be saved to the database.

Notes

Usage Guide:

Examples

Collect shortable shares data for US stocks:

quantrocket fundamental collect-ibkr-shortshares --countries usa

Collect shortable shares data for all stocks:

quantrocket fundamental collect-ibkr-shortshares

collect-ibkr-borrowfees

collect Interactive Brokers borrow fees data and save to database

quantrocket fundamental collect-ibkr-borrowfees [-h] [-c [COUNTRY ...]]

Named Arguments

-c, --countries

limit to these countries (pass ‘?’ or any invalid country to see available countries)

Collect Interactive Brokers borrow fees data and save to database.

Data is organized by country. Historical data is available from April 2018.

Notes

Usage Guide:

Examples

Collect borrow fees for US stocks:

quantrocket fundamental collect-ibkr-borrowfees --countries usa

Collect borrow fees for all stocks:

quantrocket fundamental collect-ibkr-borrowfees

collect-ibkr-margin

collect Interactive Brokers margin requirements data and save to database

quantrocket fundamental collect-ibkr-margin [-h] -c COUNTRY

Named Arguments

-c, --country

the country of the IBKR subsidiary where your account is located (pass ‘?’ or any invalid country to see available countries)

Collect Interactive Brokers margin requirements data and save to database.

The country parameter refers to the country of the IBKR subsidiary where your account is located. (Margin requirements vary by IBKR subsidiary.) Note that this differs from the IBKR shortable shares or borrow fees APIs, where the countries parameter refers to the country of the security rather than the country of the account.

Historical data is available from April 2018.

Notes

Usage Guide:

Examples

Collect margin requirements for a US-based account:

quantrocket fundamental collect-ibkr-margin --country usa

collect-alpaca-etb

collect Alpaca easy-to-borrow data and save to database

quantrocket fundamental collect-alpaca-etb [-h]

Collect Alpaca easy-to-borrow data and save to database.

Data is updated daily. Historical data is available from March 2019.

Notes

Usage Guide:

Examples

Collect easy-to-borrow data:

quantrocket fundamental collect-alpaca-etb

sharadar-fundamentals

query Sharadar Fundamentals from the local database and download to file

quantrocket fundamental sharadar-fundamentals [-h] [-s YYYY-MM-DD]
                                              [-e YYYY-MM-DD]
                                              [-u [UNIVERSE ...]]
                                              [-i [SID ...]]
                                              [--exclude-universes [UNIVERSE ...]]
                                              [--exclude-sids [SID ...]]
                                              [-m [{ARQ,ARY,ART,MRQ,MRY,MRT} ...]]
                                              [-o OUTFILE] [-j]
                                              [-f [FIELD ...]]

filtering options

-s, --start-date

limit to fundamentals on or after this fiscal period end date

-e, --end-date

limit to fundamentals on or before this fiscal period end date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

-m, --dimensions

Possible choices: ARQ, ARY, ART, MRQ, MRY, MRT

limit to these dimensions. Possible choices: [‘ARQ’, ‘ARY’, ‘ART’, ‘MRQ’, ‘MRY’, ‘MRT’]. AR=As Reported, MR=Most Recent Reported, Q=Quarterly, Y=Annual, T=Trailing Twelve Month.

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-f, --fields

only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Query Sharadar Fundamentals from the local database and download to file.

Notes

Usage Guide:

Examples

Query as-reported trailing twelve month (ART) fundamentals for all indicators for a particular sid:

quantrocket fundamental sharadar-fundamentals -i FIBBG12345 --dimensions ART -o aapl_fundamentals.csv

Query as-reported quarterly (ARQ) fundamentals for select indicators for a universe:

quantrocket fundamental sharadar-fundamentals -u usa-stk --dimensions ARQ -f REVENUE EPS -o sharadar_fundamentals.csv

sharadar-insiders

query Sharadar insider holdings data from the local database and download to file

quantrocket fundamental sharadar-insiders [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                                          [-u [UNIVERSE ...]] [-i [SID ...]]
                                          [--exclude-universes [UNIVERSE ...]]
                                          [--exclude-sids [SID ...]]
                                          [-o OUTFILE] [-j] [-f [FIELD ...]]

filtering options

-s, --start-date

limit to data on or after this filing date

-e, --end-date

limit to data on or before this filing date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-f, --fields

only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Query Sharadar insider holdings data from the local database and download to file.

Notes

Usage Guide:

Examples

Query insider holdings data for a particular sid:

quantrocket fundamental sharadar-insiders -i FIBBG000B9XRY4 -o aapl_insiders.csv

sharadar-institutions

query Sharadar institutional investor data from the local database and download to file

quantrocket fundamental sharadar-institutions [-h] [-s YYYY-MM-DD]
                                              [-e YYYY-MM-DD]
                                              [-u [UNIVERSE ...]]
                                              [-i [SID ...]]
                                              [--exclude-universes [UNIVERSE ...]]
                                              [--exclude-sids [SID ...]]
                                              [-o OUTFILE] [-j]
                                              [-f [FIELD ...]] [-d]

filtering options

-s, --start-date

limit to data on or after this quarter end date

-e, --end-date

limit to data on or before this quarter end date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-f, --fields

only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

-d, --detail

query detailed investor data (separate record per investor per security per quarter). If omitted, query data aggregated by security (separate record per security per quarter)

Default: False

Query Sharadar institutional investor data from the local database and download to file.

Notes

Usage Guide:

Examples

Query institutional investor data aggregated by security:

quantrocket fundamental sharadar-institutions -u usa-stk -s 2019-01-01 -o institutions.csv

sharadar-sec8

query Sharadar SEC Form 8-K events data from the local database and download to file

quantrocket fundamental sharadar-sec8 [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                                      [-u [UNIVERSE ...]] [-i [SID ...]]
                                      [--exclude-universes [UNIVERSE ...]]
                                      [--exclude-sids [SID ...]]
                                      [-c [INT ...]] [-o OUTFILE] [-j]
                                      [-f [FIELD ...]]

filtering options

-s, --start-date

limit to data on or after this filing date

-e, --end-date

limit to data on or before this filing date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

-c, --event-codes

limit to these event codes

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-f, --fields

only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Query Sharadar SEC Form 8-K events data from the local database and download to file.

Notes

Usage Guide:

Examples

Query event code 13 (Bankruptcy) for a universe of securities:

quantrocket fundamental sharadar-sec8 -u usa-stk --event-codes 13 -o bankruptcies.csv

sharadar-sp500

query Sharadar S&P 500 index changes (additions and removals) from the local database and download to file

quantrocket fundamental sharadar-sp500 [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                                       [-u [UNIVERSE ...]] [-i [SID ...]]
                                       [--exclude-universes [UNIVERSE ...]]
                                       [--exclude-sids [SID ...]] [-o OUTFILE]
                                       [-j] [-f [FIELD ...]]

filtering options

-s, --start-date

limit to index changes on or after this date

-e, --end-date

limit to index changes on or before this date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-f, --fields

only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Query Sharadar S&P 500 index changes (additions and removals) from the local database and download to file.

Notes

Usage Guide:

Examples

Query S&P 500 index changes since 2010:

quantrocket fundamental sharadar-sp500 -s 2010-01-01 -o sp500_changes.csv

ibkr-shortshares

query intraday or daily Interactive Brokers shortable shares data from the local database and download to file

quantrocket fundamental ibkr-shortshares [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                                         [-u [UNIVERSE ...]] [-i [SID ...]]
                                         [--exclude-universes [UNIVERSE ...]]
                                         [--exclude-sids [SID ...]]
                                         [-o OUTFILE] [-j] [-a]

filtering options

-s, --start-date

limit to data on or after this date

-e, --end-date

limit to data on or before this date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-a, --aggregate

return aggregated daily data containing the min, max, mean, and last shortable share quantities per security per day. If omitted, return intraday data.

Default: False

Query intraday or daily Interactive Brokers shortable shares data from the local database and download to file.

Intraday data timestamps are UTC.

Notes

Usage Guide:

Examples

Query shortable shares for a universe of Australian stocks:

quantrocket fundamental ibkr-shortshares -u asx-stk -o asx_shortables.csv

Query aggregated daily data instead:

quantrocket fundamental ibkr-shortshares -u asx-stk -o asx_shortables.csv --aggregate
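
The daily aggregation produced by --aggregate (the min, max, mean, and last shortable share quantity per security per day) can be sketched in pandas; the records and column names below are hypothetical, not the actual output schema:

```python
import pandas as pd

# Hypothetical intraday shortable shares records (timestamps are UTC)
intraday = pd.DataFrame({
    "Sid": ["FIBBG1", "FIBBG1", "FIBBG1", "FIBBG1"],
    "Date": pd.to_datetime([
        "2023-01-02 09:30", "2023-01-02 15:45",
        "2023-01-03 09:30", "2023-01-03 15:45"]),
    "Quantity": [10000, 8000, 12000, 9000],
})

# Aggregate to one row per security per day: min, max, mean, last
daily = (
    intraday
    .groupby(["Sid", intraday["Date"].dt.date])["Quantity"]
    .agg(MinQuantity="min", MaxQuantity="max",
         MeanQuantity="mean", LastQuantity="last")
    .reset_index()
)
print(daily)
```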

ibkr-borrowfees

query Interactive Brokers borrow fees from the local database and download to file

quantrocket fundamental ibkr-borrowfees [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                                        [-u [UNIVERSE ...]] [-i [SID ...]]
                                        [--exclude-universes [UNIVERSE ...]]
                                        [--exclude-sids [SID ...]]
                                        [-o OUTFILE] [-j]

filtering options

-s, --start-date

limit to data on or after this date

-e, --end-date

limit to data on or before this date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

Query Interactive Brokers borrow fees from the local database and download to file.

Notes

Usage Guide:

Examples

Query borrow fees for a universe of Australian stocks:

quantrocket fundamental ibkr-borrowfees -u asx-stk -o asx_borrow_fees.csv

ibkr-margin

query Interactive Brokers margin requirements from the local database and download to file

quantrocket fundamental ibkr-margin [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                                    [-u [UNIVERSE ...]] [-i [SID ...]]
                                    [--exclude-universes [UNIVERSE ...]]
                                    [--exclude-sids [SID ...]] [-o OUTFILE]
                                    [-j]

filtering options

-s, --start-date

limit to data on or after this date

-e, --end-date

limit to data on or before this date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

Query Interactive Brokers margin requirements from the local database and download to file.

Only stocks with special margin requirements are included in the dataset; default margin requirements apply to stocks that are omitted. A value of 0 in the dataset is a placeholder that likewise indicates that default margin requirements apply.

Margin requirements are expressed as whole-number percentages; for example, 50 means a 50% margin requirement, equivalent to 0.5.

Data timestamps are UTC.
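
Downstream code typically converts these whole-number percentages to fractions and substitutes its own default for the 0 placeholder. A minimal sketch (the values, the column name, and the 50% default are all made up for illustration):

```python
import pandas as pd

# Hypothetical margin requirement values: whole-number percentages,
# with 0 as a placeholder meaning "default requirements apply"
margins = pd.Series([50, 100, 0, 75], name="LongInitialMargin")

DEFAULT_MARGIN = 0.50  # assumed account default; check your broker's actual default

# Convert whole-number percentages to fractions, treating 0 as the default
margin_fractions = margins.replace(0, DEFAULT_MARGIN * 100) / 100
print(margin_fractions.tolist())  # [0.5, 1.0, 0.5, 0.75]
```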

Notes

Usage Guide:

Examples

Query margin requirements for a universe of US stocks:

quantrocket fundamental ibkr-margin -u usa-stk -o usa_margin_requirements.csv

alpaca-etb

query Alpaca easy-to-borrow data from the local database and download to file

quantrocket fundamental alpaca-etb [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                                   [-u [UNIVERSE ...]] [-i [SID ...]]
                                   [--exclude-universes [UNIVERSE ...]]
                                   [--exclude-sids [SID ...]] [-o OUTFILE]
                                   [-j]

filtering options

-s, --start-date

limit to data on or after this date

-e, --end-date

limit to data on or before this date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

Query Alpaca easy-to-borrow data from the local database and download to file.

Notes

Usage Guide:

Examples

Query easy-to-borrow data for a universe of US stocks:

quantrocket fundamental alpaca-etb -u usa-stk -o usa_etb.csv

collect-wsh

[DEPRECATED] collect Wall Street Horizon upcoming earnings announcement dates from Interactive Brokers and save to database

quantrocket fundamental collect-wsh [-h] [-u [UNIVERSE ...]] [-i [SID ...]]
                                    [-f]

Named Arguments

-u, --universes

limit to these universes (must provide universes, sids, or both)

-i, --sids

limit to these sids (must provide universes, sids, or both)

-f, --force

collect earnings dates for all securities even if they were collected recently (default is to skip securities that were updated in the last 12 hours)

Default: False

Collect Wall Street Horizon upcoming earnings announcement dates from Interactive Brokers and save to database.

DEPRECATED. This data is no longer available from Interactive Brokers except for legacy subscribers.

Examples

Collect upcoming earnings dates for a universe of US stocks:

quantrocket fundamental collect-wsh --universes 'usa-stk'

Collect upcoming earnings dates for a particular security:

quantrocket fundamental collect-wsh --sids FIBBG123456

reuters-financials

[DEPRECATED] query financial statements from the Reuters financials database and download to file

quantrocket fundamental reuters-financials [-h] [-s YYYY-MM-DD]
                                           [-e YYYY-MM-DD] [-u [UNIVERSE ...]]
                                           [-i [SID ...]]
                                           [--exclude-universes [UNIVERSE ...]]
                                           [--exclude-sids [SID ...]] [-q]
                                           [-r] [-o OUTFILE] [-j]
                                           [-f [FIELD ...]]
                                           CODE [CODE ...]

Positional Arguments

CODE

the Chart of Account (COA) code(s) to query

filtering options

-s, --start-date

limit to statements on or after this fiscal period end date

-e, --end-date

limit to statements on or before this fiscal period end date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

-q, --interim

return interim reports (default is to return annual reports, which provide deeper history)

Default: False

-r, --exclude-restatements

exclude restatements (default is to include them)

Default: False

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-f, --fields

only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Query financial statements from the Reuters financials database and download to file.

DEPRECATED. This data is no longer available from Interactive Brokers. Only data that was previously saved to the local database can be queried.

Examples

Query total revenue (COA code RTLR) for a universe of Australian stocks:

quantrocket fundamental reuters-financials RTLR -u asx-stk -s 2014-01-01 -e 2017-01-01 -o rtlr.csv

Query net income (COA code NINC) from interim reports for two securities (identified by sid) and exclude restatements:

quantrocket fundamental reuters-financials NINC -i FIBBG123456 FIBBG234567 --interim --exclude-restatements -o ninc.csv

Query common and preferred shares outstanding (COA codes QTCO and QTPO) and return a minimal set of fields (several required fields will always be returned):

quantrocket fundamental reuters-financials QTCO QTPO -u nyse-stk --fields Amount -o nyse_float.csv

reuters-estimates

[DEPRECATED] query estimates and actuals from the Reuters estimates database and download to file

quantrocket fundamental reuters-estimates [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                                          [-u [UNIVERSE ...]] [-i [SID ...]]
                                          [--exclude-universes [UNIVERSE ...]]
                                          [--exclude-sids [SID ...]]
                                          [-t [PERIOD_TYPE ...]] [-o OUTFILE]
                                          [-j] [-f [FIELD ...]]
                                          CODE [CODE ...]

Positional Arguments

CODE

the indicator code(s) to query

filtering options

-s, --start-date

limit to estimates and actuals on or after this fiscal period end date

-e, --end-date

limit to estimates and actuals on or before this fiscal period end date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

-t, --period-types

Possible choices: A, Q, S

limit to these fiscal period types. Possible choices: [‘A’, ‘Q’, ‘S’], where A=Annual, Q=Quarterly, S=Semi-Annual

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-f, --fields

only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Query estimates and actuals from the Reuters estimates database and download to file.

DEPRECATED. This data is no longer available from Interactive Brokers. Only data that was previously saved to the local database can be queried.

Examples

Query EPS estimates and actuals for a universe of Australian stocks:

quantrocket fundamental reuters-estimates EPS -u asx-stk -s 2014-01-01 -e 2017-01-01 -o eps_estimates.csv

wsh

[DEPRECATED] query earnings announcement dates from the Wall Street Horizon announcements database and download to file

quantrocket fundamental wsh [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                            [-u [UNIVERSE ...]] [-i [SID ...]]
                            [--exclude-universes [UNIVERSE ...]]
                            [--exclude-sids [SID ...]] [-t [STATUS ...]]
                            [-o OUTFILE] [-j] [-f [FIELD ...]]

filtering options

-s, --start-date

limit to announcements on or after this date

-e, --end-date

limit to announcements on or before this date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

-t, --statuses

Possible choices: Confirmed, Unconfirmed

limit to these confirmation statuses. Possible choices: [‘Confirmed’, ‘Unconfirmed’]

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-f, --fields

only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Query earnings announcement dates from the Wall Street Horizon announcements database and download to file.

DEPRECATED. This data is no longer available from Interactive Brokers except for legacy subscribers.

Examples

Query earnings dates for a universe of US stocks:

quantrocket fundamental wsh -u usa-stk -s 2019-01-01 -e 2019-04-01 -o announcements.csv
quantrocket.fundamental.collect_sharadar_fundamentals(country='US')

Collect fundamental data from Sharadar and save to database.

Parameters:

country (str, required) – country to collect fundamentals for. Possible choices: US, FREE

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.fundamental.collect_sharadar_insiders(country='US')

Collect insider holdings data from Sharadar and save to database.

Parameters:

country (str, required) – country to collect insider holdings data for. Possible choices: US, FREE

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.fundamental.collect_sharadar_institutions(country='US', detail=False)

Collect institutional investor data from Sharadar and save to database.

Parameters:
  • country (str, required) – country to collect institutional investor data for. Possible choices: US, FREE

  • detail (bool) – if true, collect detailed investor data (separate record per investor per security per quarter). If false (the default), collect data aggregated by security (separate record per security per quarter).

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.fundamental.collect_sharadar_sec8(country='US')

Collect SEC Form 8-K events from Sharadar and save to database.

Parameters:

country (str, required) – country to collect events data for. Possible choices: US, FREE

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.fundamental.collect_sharadar_sp500(country='US')

Collect historical S&P 500 index constituents from Sharadar and save to database.

Parameters:

country (str, required) – country to collect S&P 500 constituents data for. Possible choices: US, FREE

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.fundamental.download_sharadar_fundamentals(filepath_or_buffer=None, start_date=None, end_date=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None, dimensions=None, fields=None, output='csv')

Query Sharadar fundamentals from the local database and download to file.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json, csv, default is csv)

  • start_date (str (YYYY-MM-DD), optional) – limit to fundamentals on or after this fiscal period end date

  • end_date (str (YYYY-MM-DD), optional) – limit to fundamentals on or before this fiscal period end date

  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

  • dimensions (list of str, optional) – limit to these dimensions. Possible choices: ARQ, ARY, ART, MRQ, MRY, MRT. AR=As Reported, MR=Most Recent Reported, Q=Quarterly, Y=Annual, T=Trailing Twelve Month.

  • fields (list of str, optional) – only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Return type:

None

Notes

Usage Guide:

Examples

Query as-reported trailing twelve month (ART) fundamentals for all indicators for a particular sid, then load the CSV into Pandas:

>>> download_sharadar_fundamentals(filepath_or_buffer="aapl_fundamentals.csv",
                                   sids="FIBBG265598", dimensions="ART")
>>> fundamentals = pd.read_csv("aapl_fundamentals.csv", parse_dates=["REPORTPERIOD", "DATEKEY", "CALENDARDATE"])

Query as-reported quarterly (ARQ) fundamentals for select indicators for a universe:

>>> download_sharadar_fundamentals(filepath_or_buffer="sharadar_fundamentals.csv",
                                   universes="usa-stk",
                                   dimensions="ARQ", fields=["REVENUE", "EPS"])
quantrocket.fundamental.download_sharadar_insiders(filepath_or_buffer=None, start_date=None, end_date=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None, fields=None, output='csv')

Query Sharadar insider holdings data from the local database and download to file.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json, csv, default is csv)

  • start_date (str (YYYY-MM-DD), optional) – limit to data on or after this filing date

  • end_date (str (YYYY-MM-DD), optional) – limit to data on or before this filing date

  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

  • fields (list of str, optional) – only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Return type:

None

Notes

Usage Guide:

Examples

Query insider holdings data for a particular sid, then load the CSV into Pandas:

>>> download_sharadar_insiders(filepath_or_buffer="aapl_insiders.csv",
                                sids="FIBBG000B9XRY4")
>>> insiders = pd.read_csv("aapl_insiders.csv", parse_dates=["FILINGDATE", "TRANSACTIONDATE"])
quantrocket.fundamental.download_sharadar_institutions(filepath_or_buffer=None, start_date=None, end_date=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None, detail=False, fields=None, output='csv')

Query Sharadar institutional investor data from the local database and download to file.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json, csv, default is csv)

  • start_date (str (YYYY-MM-DD), optional) – limit to data on or after this quarter end date

  • end_date (str (YYYY-MM-DD), optional) – limit to data on or before this quarter end date

  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

  • detail (bool) – if true, query detailed investor data (separate record per investor per security per quarter). If false (the default), query data aggregated by security (separate record per security per quarter).

  • fields (list of str, optional) – only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Return type:

None

Notes

Usage Guide:

Examples

Query institutional investor data aggregated by security and load the CSV into Pandas:

>>> download_sharadar_institutions(filepath_or_buffer="institutions.csv",
                                    universes="usa-stk", start_date="2019-01-01")
>>> institutions = pd.read_csv("institutions.csv", parse_dates=["CALENDARDATE"])
quantrocket.fundamental.download_sharadar_sec8(filepath_or_buffer=None, start_date=None, end_date=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None, event_codes=None, fields=None, output='csv')

Query Sharadar SEC Form 8-K events data from the local database and download to file.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json, csv, default is csv)

  • start_date (str (YYYY-MM-DD), optional) – limit to data on or after this filing date

  • end_date (str (YYYY-MM-DD), optional) – limit to data on or before this filing date

  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

  • event_codes (list of int, optional) – limit to these event codes

  • fields (list of str, optional) – only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Return type:

None

Notes

Usage Guide:

Examples

Query event code 13 (Bankruptcy) for a universe of securities and load into Pandas:

>>> download_sharadar_sec8(filepath_or_buffer="bankruptcies.csv",
                            universes="usa-stk", event_codes=13)
>>> bankruptcies = pd.read_csv("bankruptcies.csv", parse_dates=["DATE"])
quantrocket.fundamental.download_sharadar_sp500(filepath_or_buffer=None, start_date=None, end_date=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None, fields=None, output='csv')

Query Sharadar S&P 500 index changes (additions and removals) from the local database and download to file.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json, csv, default is csv)

  • start_date (str (YYYY-MM-DD), optional) – limit to index changes on or after this date

  • end_date (str (YYYY-MM-DD), optional) – limit to index changes on or before this date

  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

  • fields (list of str, optional) – only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Return type:

None

Notes

Usage Guide:

Examples

Query S&P 500 index changes since 2010 and load into Pandas:

>>> download_sharadar_sp500(filepath_or_buffer="sp500_changes.csv", start_date="2010-01-01")
>>> sp500_changes = pd.read_csv("sp500_changes.csv", parse_dates=["DATE"])

Get the current members of the S&P 500:

>>> download_sharadar_sp500(filepath_or_buffer="sp500_changes.csv")
>>> sp500_changes = pd.read_csv("sp500_changes.csv", parse_dates=["DATE"])
>>> latest_changes = sp500_changes.drop_duplicates(subset="Sid", keep="last")
>>> current_members = latest_changes[latest_changes.ACTION == "added"]
quantrocket.fundamental.get_sharadar_fundamentals_reindexed_like(reindex_like, fields=None, dimension='ART', period_offset=0)

Return a multiindex (Field, Date) DataFrame of point-in-time Sharadar fundamentals, reindexed to match the index (dates) and columns (sids) of reindex_like. Financial indicators are forward-filled in order to provide the latest reading at any given date. Indicators are indexed to the Sharadar DATEKEY field, i.e. the filing date. DATEKEY is shifted forward 1 day to avoid lookahead bias.

Parameters:
  • reindex_like (DataFrame, required) – a DataFrame (usually of prices) with dates for the index and sids for the columns, to which the shape of the resulting DataFrame will be conformed

  • fields (list of str) – a list of fields to include in the resulting DataFrame. Defaults to including all fields. For faster performance, limiting fields to those needed is highly recommended, especially for large universes.

  • dimension (str) – the dimension of the data. Defaults to As Reported Trailing Twelve Month (ART). Possible choices: ARQ, ARY, ART, MRQ, MRY, MRT. AR=As Reported, MR=Most Recent Reported, Q=Quarterly, Y=Annual, T=Trailing Twelve Month.

  • period_offset (int, optional) – which fiscal period to return data for. If period_offset is 0 (the default), returns the most recent point-in-time fundamentals as of each date in reindex_like. If period_offset is -1, returns fundamentals for the prior fiscal period as of each date; if -2, two fiscal periods ago, etc. For quarterly and trailing-twelve-month dimensions, previous period means previous quarter, while for annual dimensions, previous period means previous year. Value should be a negative integer or 0.

Returns:

a multiindex (Field, Date) DataFrame of fundamentals, shaped like the input DataFrame

Return type:

DataFrame

Notes

Usage Guide:

Examples

Query several trailing twelve month indicators using a DataFrame of historical prices:

>>> closes = prices.loc["Close"]
>>> fundamentals = get_sharadar_fundamentals_reindexed_like(closes, fields=["EPS", "REVENUE"])
>>> eps = fundamentals.loc["EPS"]
>>> revenue = fundamentals.loc["REVENUE"]

Query quarterly book value per share using a DataFrame of historical prices:

>>> closes = prices.loc["Close"]
>>> fundamentals = get_sharadar_fundamentals_reindexed_like(closes, fields=["BVPS"],
                                                             dimension="ARQ")
>>> bvps = fundamentals.loc["BVPS"]

Query outstanding shares using a DataFrame of historical prices:

>>> closes = prices.loc["Close"]
>>> fundamentals = get_sharadar_fundamentals_reindexed_like(closes,
                                                            fields=["SHARESWA"])
>>> shares_out = fundamentals.loc["SHARESWA"]

Query outstanding shares as of the previous fiscal period:

>>> closes = prices.loc["Close"]
>>> fundamentals = get_sharadar_fundamentals_reindexed_like(closes,
                                                            fields=["SHARESWA"],
                                                            period_offset=-1)
>>> previous_shares_out = fundamentals.loc["SHARESWA"]
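
The point-in-time logic described above (index values to the filing date, shift them forward 1 day to avoid lookahead bias, then forward-fill onto the target dates) can be sketched on synthetic data:

```python
import pandas as pd

# Synthetic filings: EPS values indexed to DATEKEY (the filing date)
filings = pd.Series(
    [1.10, 1.25],
    index=pd.to_datetime(["2023-02-01", "2023-05-01"]),  # DATEKEY
)

# Target price dates to conform to
price_dates = pd.date_range("2023-02-01", "2023-05-02", freq="D")

# Shift filing dates forward 1 day so the value is not visible on
# the filing date itself, then reindex and forward-fill
point_in_time = (
    filings
    .set_axis(filings.index + pd.Timedelta(days=1))
    .reindex(price_dates)
    .ffill()
)

print(point_in_time.loc["2023-02-01"])  # NaN: not yet visible on the filing date
print(point_in_time.loc["2023-02-02"])  # 1.1: visible the day after filing
```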
quantrocket.fundamental.get_sharadar_institutions_reindexed_like(reindex_like, fields=None, shift=45)

Return a multiindex (Field, Date) DataFrame of Sharadar institutional investor data, reindexed to match the index (dates) and columns (sids) of reindex_like. Values are forward-filled in order to provide the latest reading at any given date. Data are indexed to the quarter end date. Because the reporting deadline is 45 days after the end of the quarter, the values are shifted forward 45 calendar days by default (see the shift parameter to control this).

Parameters:
  • reindex_like (DataFrame, required) – a DataFrame (usually of prices) with dates for the index and sids for the columns, to which the shape of the resulting DataFrame will be conformed

  • fields (list of str) – a list of fields to include in the resulting DataFrame. Defaults to including all fields. For faster performance, limiting fields to those needed is highly recommended, especially for large universes.

  • shift (int, optional) – shift the data forward this many days to account for the 45-day lag between the quarter end date and the reporting deadline. Defaults to 45.

Returns:

a multiindex (Field, Date) DataFrame of institutional investor data, shaped like the input DataFrame

Return type:

DataFrame

Notes

Usage Guide:

Examples

Calculate institutional ownership as a percentage of total market cap:

>>> closes = prices.loc["Close"]
>>> insti = get_sharadar_institutions_reindexed_like(closes, fields="SHRVALUE")
>>> insti_share_values = insti.loc["SHRVALUE"]
>>> fundamentals = get_sharadar_fundamentals_reindexed_like(closes, dimension="ARQ", fields="MARKETCAP")
>>> market_caps = fundamentals.loc["MARKETCAP"]
>>> insti_pct = insti_share_values/market_caps
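
The 45-day shift works the same way: a value indexed to the quarter end date only becomes visible 45 calendar days later. A sketch on synthetic data:

```python
import pandas as pd

# Synthetic institutional share values indexed to quarter end dates
shr_values = pd.Series(
    [1_000_000, 1_200_000],
    index=pd.to_datetime(["2023-03-31", "2023-06-30"]),  # quarter end
)

dates = pd.date_range("2023-04-01", "2023-08-31", freq="D")

# Shift forward 45 calendar days (the reporting deadline), then
# reindex to the target dates and forward-fill
shifted = (
    shr_values
    .set_axis(shr_values.index + pd.Timedelta(days=45))
    .reindex(dates)
    .ffill()
)

print(shifted.loc["2023-05-15"])  # first date the Q1 value is visible
```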
quantrocket.fundamental.get_sharadar_sec8_reindexed_like(reindex_like, event_codes=None)

Return a Boolean DataFrame indicating whether securities filed SEC Form 8-K for specified event codes on given dates. The resulting DataFrame will be reindexed to match the index (dates) and columns (sids) of reindex_like.

Parameters:
  • reindex_like (DataFrame, required) – a DataFrame (usually of prices) with dates for the index and sids for the columns, to which the shape of the resulting DataFrame will be conformed

  • event_codes (list of int, optional) – limit to these event codes

Returns:

a Boolean DataFrame shaped like the input DataFrame

Return type:

DataFrame

Notes

Usage Guide:

Examples

Query bankruptcies (event code 13) and use it to mask a prices DataFrame:

>>> closes = prices.loc["Close"]
>>> filed_for_bankruptcy = get_sharadar_sec8_reindexed_like(closes, event_codes=13)
>>> closes.where(filed_for_bankruptcy)
quantrocket.fundamental.get_sharadar_sp500_reindexed_like(reindex_like)

Return a Boolean DataFrame indicating whether securities were in the S&P 500 on the given dates. The resulting DataFrame will be reindexed to match the index (dates) and columns (sids) of reindex_like.

Parameters:

reindex_like (DataFrame, required) – a DataFrame (usually of prices) with dates for the index and sids for the columns, to which the shape of the resulting DataFrame will be conformed

Returns:

a Boolean DataFrame shaped like the input DataFrame

Return type:

DataFrame

Notes

Usage Guide:

Examples

Query S&P 500 membership and use it to mask a prices DataFrame:

>>> closes = prices.loc["Close"]
>>> are_in_sp500 = get_sharadar_sp500_reindexed_like(closes)
>>> closes.where(are_in_sp500)
quantrocket.fundamental.collect_ibkr_shortable_shares(countries=None)

Collect Interactive Brokers shortable shares data and save to database.

Data is organized by country and updated every 15 minutes. Historical data is available from April 2018. Detailed intraday data as well as aggregated daily data will be saved to the database.

Parameters:

countries (list of str, optional) – limit to these countries (pass ‘?’ or any invalid country to see available countries)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.fundamental.collect_ibkr_borrow_fees(countries=None)

Collect Interactive Brokers borrow fees data and save to database.

Data is organized by country. Historical data is available from April 2018.

Parameters:

countries (list of str, optional) – limit to these countries (pass ‘?’ or any invalid country to see available countries)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.fundamental.collect_ibkr_margin_requirements(country)

Collect Interactive Brokers margin requirements data and save to database.

The country parameter refers to the country of the IBKR subsidiary where your account is located. (Margin requirements vary by IBKR subsidiary.) Note that this differs from the IBKR shortable shares or borrow fees APIs, where the countries parameter refers to the country of the security rather than the country of the account.

Historical data is available from April 2018.

Parameters:

country (str, required) – the country of the IBKR subsidiary where your account is located (pass ‘?’ or any invalid country to see available countries)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.fundamental.download_ibkr_shortable_shares(filepath_or_buffer=None, output='csv', start_date=None, end_date=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None, aggregate=False)

Query intraday or daily Interactive Brokers shortable shares data from the local database and download to file.

Intraday data timestamps are UTC.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json, csv, default is csv)

  • start_date (str (YYYY-MM-DD), optional) – limit to data on or after this date

  • end_date (str (YYYY-MM-DD), optional) – limit to data on or before this date

  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

  • aggregate (bool) – if True, return aggregated daily data containing the min, max, mean, and last shortable share quantities per security per day. If False or omitted, return intraday data.

Return type:

None

Notes

Usage Guide:

Examples

Query shortable shares for a universe of Australian stocks:

>>> download_ibkr_shortable_shares("asx_shortables.csv", universes=["asx-stk"])
>>> shortables = pd.read_csv("asx_shortables.csv", parse_dates=["Date"])

Query aggregated daily data instead:

>>> download_ibkr_shortable_shares("asx_shortables.csv", universes=["asx-stk"], aggregate=True)
>>> shortables = pd.read_csv("asx_shortables.csv", parse_dates=["Date"])
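Because intraday timestamps are UTC, they can be localized and converted to an exchange timezone after loading. A sketch on a synthetic frame (the column layout and timezone are illustrative):

```python
import pandas as pd

# Synthetic stand-in for downloaded intraday shortable shares data
shortables = pd.DataFrame({
    "Date": pd.to_datetime(["2023-01-02 14:45:00", "2023-01-02 15:00:00"]),
    "Sid": ["FI12345", "FI12345"],
    "Quantity": [50000, 45000],
})

# Timestamps are UTC; localize, then convert to Sydney time
shortables["Date"] = (
    shortables["Date"]
    .dt.tz_localize("UTC")
    .dt.tz_convert("Australia/Sydney")
)
```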
quantrocket.fundamental.download_ibkr_borrow_fees(filepath_or_buffer=None, output='csv', start_date=None, end_date=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None)

Query Interactive Brokers borrow fees from the local database and download to file.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json, csv, default is csv)

  • start_date (str (YYYY-MM-DD), optional) – limit to data on or after this date

  • end_date (str (YYYY-MM-DD), optional) – limit to data on or before this date

  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

Return type:

None

Notes

Usage Guide:

Examples

Query borrow fees for a universe of Australian stocks:

>>> download_ibkr_borrow_fees("asx_borrow_fees.csv", universes=["asx-stk"])
>>> borrow_fees = pd.read_csv("asx_borrow_fees.csv", parse_dates=["Date"])
quantrocket.fundamental.download_ibkr_margin_requirements(filepath_or_buffer=None, output='csv', start_date=None, end_date=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None)

Query Interactive Brokers margin requirements from the local database and download to file.

Only stocks with special margin requirements are included in the dataset; stocks omitted from the dataset are subject to default margin requirements. A value of 0 in the dataset is a placeholder that likewise indicates default margin requirements apply.

Margin requirements are expressed as whole-number percentages; for example, 50 means a 50% margin requirement, equivalent to 0.5.

Data timestamps are UTC.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json, csv, default is csv)

  • start_date (str (YYYY-MM-DD), optional) – limit to data on or after this date

  • end_date (str (YYYY-MM-DD), optional) – limit to data on or before this date

  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

Return type:

None

Notes

Usage Guide:

Examples

Query margin requirements for a universe of US stocks:

>>> download_ibkr_margin_requirements("usa_margin_requirements.csv", universes=["usa-stk"])
>>> margin = pd.read_csv("usa_margin_requirements.csv", parse_dates=["Date"])
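When working with the downloaded data, the whole-number percentages can be converted to decimal rates, with the 0 placeholder replaced by an assumed default requirement. A sketch on synthetic values (the 25% default is illustrative; actual defaults depend on your account and the security):

```python
import pandas as pd

# Synthetic stand-in for a margin requirements column: 0 is a
# placeholder meaning the default requirement applies
long_maintenance = pd.Series([50, 0, 100, 0], name="LongMaintenanceMargin")

DEFAULT_MARGIN = 25  # illustrative default; actual defaults vary

# Replace the 0 placeholder, then convert whole-number
# percentages (e.g. 50 -> 0.5) to decimal rates
decimal_margins = long_maintenance.replace(0, DEFAULT_MARGIN) / 100
```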
quantrocket.fundamental.get_ibkr_shortable_shares_reindexed_like(reindex_like, aggregate=False, time=None, fields=None, shift=0)

Return a DataFrame of Interactive Brokers shortable shares, reindexed to match the index (dates) and columns (sids) of reindex_like.

If aggregate=False (the default), query intraday shortable shares data and return a DataFrame of the quantity of shortable shares as of the time of day specified by time.

If aggregate=True, query shortable shares data aggregated by security and date and return a multiindex DataFrame with levels (Field, Date).

Parameters:
  • reindex_like (DataFrame, required) – a DataFrame (usually of prices) with dates for the index and sids for the columns, to which the shape of the resulting DataFrame will be conformed

  • aggregate (bool) – if True, query data aggregated by security and date and return a multiindex (Field, Date) DataFrame. If False (the default), query intraday data and return a DataFrame of shortable share quantities (with a single-level Date index) as of the time of day specified by time.

  • time (str (HH:MM:SS[ TZ]), optional) – return shortable shares as of this time of day. Only applicable if aggregate=False. If omitted, shortable shares will be returned as of the times of day in reindex_like’s DatetimeIndex. (Note that for a DatetimeIndex containing dates only, the time is 00:00:00, meaning shortable shares will be returned as of midnight at the start of the day.) A time and timezone can be passed as a space-separated string (e.g. “09:30:00 America/New_York”). If timezone is omitted, the timezone of reindex_like’s DatetimeIndex will be used; if reindex_like’s timezone is not set, the timezone will be inferred from the component securities, if all securities share the same timezone.

  • fields (list of str, optional) – limit to these fields. Only applicable if aggregate=True. If omitted, all aggregate fields are included. Available fields are MinQuantity, MaxQuantity, MeanQuantity, and LastQuantity.

  • shift (int, optional) – shift shortable shares this many periods. For example, shift=1 will return the previous day’s shortable shares. By default, values are not shifted, meaning the values reflect the current day’s shortable shares.

Returns:

a DataFrame of shortable shares, shaped like the input DataFrame

Return type:

DataFrame

Notes

Usage Guide:

Examples

Get shortable shares as of midnight for a DataFrame of US stocks:

>>> closes = prices.loc["Close"]
>>> shortables = get_ibkr_shortable_shares_reindexed_like(closes)

Get shortable shares as of 9:20 AM for a DataFrame of US stocks (timezone inferred from component stocks):

>>> closes = prices.loc["Close"]
>>> shortables = get_ibkr_shortable_shares_reindexed_like(closes, time="09:20:00")

Get shortable shares as of 9:20 AM New York time for a multi-timezone DataFrame of stocks:

>>> closes = prices.loc["Close"]
>>> shortables = get_ibkr_shortable_shares_reindexed_like(closes, time="09:20:00 America/New_York")

Get aggregate shortable shares data for a DataFrame of US stocks:

>>> closes = prices.loc["Close"]
>>> shortables = get_ibkr_shortable_shares_reindexed_like(closes, aggregate=True)
>>> min_quantities = shortables.loc["MinQuantity"]
>>> max_quantities = shortables.loc["MaxQuantity"]
>>> mean_quantities = shortables.loc["MeanQuantity"]
>>> last_quantities = shortables.loc["LastQuantity"]
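The aggregate fields can be used to screen out names where intraday availability dipped too low, for example by requiring a minimum quantity available all day before considering a short. A sketch on synthetic stand-in DataFrames (the sids and 10,000-share threshold are illustrative):

```python
import pandas as pd

# Synthetic stand-in for the MinQuantity field of the aggregate data
dates = pd.date_range("2023-01-02", periods=3, freq="D")
min_quantities = pd.DataFrame(
    {"FI12345": [50000, 8000, 60000],
     "FI23456": [0, 0, 0]},
    index=dates,
)
closes = pd.DataFrame(
    {"FI12345": [10.0, 10.5, 11.0],
     "FI23456": [20.0, 19.5, 19.0]},
    index=dates,
)

# Only consider shorts where at least 10,000 shares were
# available throughout the day
shortable_enough = min_quantities >= 10000
short_candidates = closes.where(shortable_enough)
```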
quantrocket.fundamental.get_ibkr_borrow_fees_reindexed_like(reindex_like, shift=0)

Return a DataFrame of Interactive Brokers borrow fees, reindexed to match the index (dates) and columns (sids) of reindex_like.

Parameters:
  • reindex_like (DataFrame, required) – a DataFrame (usually of prices) with dates for the index and sids for the columns, to which the shape of the resulting DataFrame will be conformed

  • shift (int, optional) – shift borrow fees this many periods. For example, shift=1 will return the previous day’s borrow fees. By default, values are not shifted, meaning the values reflect the current day’s borrow fees.

Returns:

a DataFrame of borrow fees, shaped like the input DataFrame

Return type:

DataFrame

Notes

Usage Guide:

Examples

Get borrow fees as of midnight for a DataFrame of US stocks:

>>> closes = prices.loc["Close"]
>>> borrow_fees = get_ibkr_borrow_fees_reindexed_like(closes)
quantrocket.fundamental.get_ibkr_margin_requirements_reindexed_like(reindex_like, time=None, shift=0)

Return a multiindex (Field, Date) DataFrame of Interactive Brokers margin requirements, reindexed to match the index (dates) and columns (sids) of reindex_like.

Returned fields are LongInitialMargin, LongMaintenanceMargin, ShortInitialMargin, and ShortMaintenanceMargin. Margin requirements are expressed as whole-number percentages; for example, 50 means a 50% margin requirement, equivalent to 0.5.

Parameters:
  • reindex_like (DataFrame, required) – a DataFrame (usually of prices) with dates for the index and sids for the columns, to which the shape of the resulting DataFrame will be conformed

  • time (str (HH:MM:SS[ TZ]), optional) – return margin requirements as of this time of day. If omitted, margin requirements will be returned as of the times of day in reindex_like’s DatetimeIndex. (Note that for a DatetimeIndex containing dates only, the time is 00:00:00, meaning margin requirements will be returned as of midnight at the start of the day.) A time and timezone can be passed as a space-separated string (e.g. “09:30:00 America/New_York”). If timezone is omitted, the timezone of reindex_like’s DatetimeIndex will be used; if reindex_like’s timezone is not set, the timezone will be inferred from the component securities, if all securities share the same timezone.

  • shift (int, optional) – shift margin requirements this many periods. For example, shift=1 will return the previous day’s margin requirements. By default, values are not shifted, meaning the values reflect the current day’s margin requirements.

Returns:

a multiindex (Field, Date) DataFrame of margin requirements, shaped like the input DataFrame

Return type:

DataFrame

Notes

Usage Guide:

Examples

Get margin requirements as of midnight for a DataFrame of US stocks and calculate the greater of ShortInitialMargin and ShortMaintenanceMargin:

>>> closes = prices.loc["Close"]
>>> margin_requirements = get_ibkr_margin_requirements_reindexed_like(closes)
>>> short_initial_margins = margin_requirements.loc["ShortInitialMargin"]
>>> short_maintenance_margins = margin_requirements.loc["ShortMaintenanceMargin"]
>>> short_margins = short_initial_margins.where(
...     short_initial_margins > short_maintenance_margins,
...     short_maintenance_margins)

Get margin requirements as of 4:30 PM for a DataFrame of US stocks (timezone inferred from component stocks):

>>> closes = prices.loc["Close"]
>>> margin_requirements = get_ibkr_margin_requirements_reindexed_like(
...     closes, time="16:30:00")

Get margin requirements as of 4:30 PM New York time for a multi-timezone DataFrame of stocks:

>>> closes = prices.loc["Close"]
>>> margin_requirements = get_ibkr_margin_requirements_reindexed_like(
...     closes, time="16:30:00 America/New_York")
quantrocket.fundamental.collect_alpaca_etb()

Collect Alpaca easy-to-borrow data and save to database.

Data is updated daily. Historical data is available from March 2019.

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.fundamental.download_alpaca_etb(filepath_or_buffer, output='csv', start_date=None, end_date=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None)

Query Alpaca easy-to-borrow data from the local database and download to file.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json, csv, default is csv)

  • start_date (str (YYYY-MM-DD), optional) – limit to data on or after this date

  • end_date (str (YYYY-MM-DD), optional) – limit to data on or before this date

  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

Return type:

None

Notes

Usage Guide:

Examples

Query easy-to-borrow data for a universe of US stocks:

>>> download_alpaca_etb("usa_etb.csv", universes=["usa-stk"])
>>> etb = pd.read_csv("usa_etb.csv", parse_dates=["Date"])
quantrocket.fundamental.get_alpaca_etb_reindexed_like(reindex_like)

Return a DataFrame of Alpaca easy-to-borrow status, reindexed to match the index (dates) and columns (sids) of reindex_like.

Parameters:

reindex_like (DataFrame, required) – a DataFrame (usually of prices) with dates for the index and sids for the columns, to which the shape of the resulting DataFrame will be conformed

Returns:

a Boolean DataFrame indicating easy-to-borrow status, shaped like the input DataFrame

Return type:

DataFrame

Notes

Usage Guide:

Examples

Get easy-to-borrow status for a DataFrame of stocks:

>>> closes = prices.loc["Close"]
>>> are_etb = get_alpaca_etb_reindexed_like(closes)

Fundamental Data API

Resource Group

Sharadar Fundamentals

Collect Sharadar Fundamentals
POST/fundamental/sharadar/fundamentals{?country}

Collect fundamental data from Sharadar and save to database.

Example URI

POST http://houston/fundamental/sharadar/fundamentals?country=US
URI Parameters
country
str (required) Example: US

country to collect fundamentals for

Choices: US FREE

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the fundamental data will be collected asynchronously"
}

Download Sharadar Fundamentals
GET/fundamental/sharadar/fundamentals{filetype}{?start_date,end_date,universes,sids,exclude_universes,exclude_sids,dimensions,fields}

Query Sharadar fundamentals from the local database and download to file.

Example URI

GET http://houston/fundamental/sharadar/fundamentals.csv?start_date=2014-01-01&end_date=2018-01-01&universes=nyse-stk&sids=FI12345&exclude_universes=other-universe&exclude_sids=FI23456&dimensions=ARQ&fields=EPS
URI Parameters
filetype
str (required) Example: .csv

output format

Choices: .csv .json

start_date
str (optional) Example: 2014-01-01

limit to fundamentals on or after this fiscal period end date

end_date
str (optional) Example: 2018-01-01

limit to fundamentals on or before this fiscal period end date

universes
str (optional) Example: nyse-stk

limit to these universes (pass multiple times for multiple universes)

sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: other-universe

exclude these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

dimensions
str (optional) Example: ARQ

limit to these dimensions

Choices: ARQ ARY ART MRQ MRY MRT

fields
str (optional) Example: EPS

only return these fields (pass multiple times for multiple fields)

Response  200
Headers
Content-Type: text/csv
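Parameters documented as "pass multiple times" are encoded as repeated query-string keys. A sketch using Python's standard library to build such a URL (the host and database code mirror the example above; the second universe code is illustrative):

```python
from urllib.parse import urlencode

# A list of tuples preserves repeated keys, unlike a dict
params = [
    ("start_date", "2014-01-01"),
    ("end_date", "2018-01-01"),
    # repeated keys select multiple universes
    ("universes", "nyse-stk"),
    ("universes", "nasdaq-stk"),
    ("dimensions", "ARQ"),
    ("fields", "EPS"),
]
url = ("http://houston/fundamental/sharadar/fundamentals.csv?"
       + urlencode(params))
```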

Sharadar Insiders

Collect Sharadar Insiders
POST/fundamental/sharadar/insiders{?country}

Collect insider holdings data from Sharadar and save to database.

Example URI

POST http://houston/fundamental/sharadar/insiders?country=US
URI Parameters
country
str (required) Example: US

country to collect insider holdings data for

Choices: US FREE

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the insiders data will be collected asynchronously"
}

Download Sharadar Insiders
GET/fundamental/sharadar/insiders{filetype}{?start_date,end_date,universes,sids,exclude_universes,exclude_sids,fields}

Query Sharadar insider holdings data from the local database and download to file.

Example URI

GET http://houston/fundamental/sharadar/insiders.csv?start_date=2014-01-01&end_date=2018-01-01&universes=nyse-stk&sids=FI12345&exclude_universes=other-universe&exclude_sids=FI23456&fields=OWNERNAME
URI Parameters
filetype
str (required) Example: .csv

output format

Choices: .csv .json

start_date
str (optional) Example: 2014-01-01

limit to data on or after this filing date

end_date
str (optional) Example: 2018-01-01

limit to data on or before this filing date

universes
str (optional) Example: nyse-stk

limit to these universes (pass multiple times for multiple universes)

sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: other-universe

exclude these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

fields
str (optional) Example: OWNERNAME

only return these fields (pass multiple times for multiple fields)

Response  200
Headers
Content-Type: text/csv

Sharadar Institutions

Collect Sharadar Institutions
POST/fundamental/sharadar/institutions{?country,detail}

Collect institutional investor data from Sharadar and save to database.

Example URI

POST http://houston/fundamental/sharadar/institutions?country=US&detail=true
URI Parameters
country
str (required) Example: US

country to collect institutional investor data for

Choices: US FREE

detail
bool (optional) Example: true

if true, collect detailed investor data (separate record per investor per security per quarter). If false (the default), collect data aggregated by security (separate record per security per quarter).

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the institutional investor data will be collected asynchronously"
}

Download Sharadar Institutions
GET/fundamental/sharadar/institutions{filetype}{?detail,start_date,end_date,universes,sids,exclude_universes,exclude_sids,fields}

Query Sharadar institutional investor data from the local database and download to file.

Example URI

GET http://houston/fundamental/sharadar/institutions.csv?detail=true&start_date=2014-01-01&end_date=2018-01-01&universes=nyse-stk&sids=FI12345&exclude_universes=other-universe&exclude_sids=FI23456&fields=INVESTORNAME
URI Parameters
filetype
str (required) Example: .csv

output format

Choices: .csv .json

start_date
str (optional) Example: 2014-01-01

limit to data on or after this quarter end date

end_date
str (optional) Example: 2018-01-01

limit to data on or before this quarter end date

universes
str (optional) Example: nyse-stk

limit to these universes (pass multiple times for multiple universes)

sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: other-universe

exclude these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

fields
str (optional) Example: INVESTORNAME

only return these fields (pass multiple times for multiple fields)

detail
bool (optional) Example: true

if true, query detailed investor data (separate record per investor per security per quarter). If false (the default), query data aggregated by security (separate record per security per quarter).

Response  200
Headers
Content-Type: text/csv

Sharadar SEC Form 8-K

Collect Sharadar SEC 8-K
POST/fundamental/sharadar/sec8{?country}

Collect SEC Form 8-K events from Sharadar and save to database.

Example URI

POST http://houston/fundamental/sharadar/sec8?country=US
URI Parameters
country
str (required) Example: US

country to collect events data for

Choices: US FREE

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the sec8 data will be collected asynchronously"
}

Download Sharadar SEC 8-K
GET/fundamental/sharadar/sec8{filetype}{?start_date,end_date,event_codes,universes,sids,exclude_universes,exclude_sids,fields}

Query Sharadar SEC Form 8-K events data from the local database and download to file.

Example URI

GET http://houston/fundamental/sharadar/sec8.csv?start_date=2014-01-01&end_date=2018-01-01&event_codes=11&universes=nyse-stk&sids=FI12345&exclude_universes=other-universe&exclude_sids=FI23456&fields=EVENTCODE
URI Parameters
filetype
str (required) Example: .csv

output format

Choices: .csv .json

start_date
str (optional) Example: 2014-01-01

limit to data on or after this filing date

end_date
str (optional) Example: 2018-01-01

limit to data on or before this filing date

universes
str (optional) Example: nyse-stk

limit to these universes (pass multiple times for multiple universes)

sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: other-universe

exclude these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

event_codes
int (optional) Example: 11

limit to these event codes (pass multiple times for multiple event codes)

fields
str (optional) Example: EVENTCODE

only return these fields (pass multiple times for multiple fields)

Response  200
Headers
Content-Type: text/csv

Sharadar S&P 500

Collect Sharadar S&P 500
POST/fundamental/sharadar/sp500{?country}

Collect historical S&P 500 index constituents from Sharadar and save to database.

Example URI

POST http://houston/fundamental/sharadar/sp500?country=US
URI Parameters
country
str (required) Example: US

country to collect S&P 500 constituents data for

Choices: US FREE

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the sp500 data will be collected asynchronously"
}

Download Sharadar S&P 500
GET/fundamental/sharadar/sp500{filetype}{?start_date,end_date,universes,sids,exclude_universes,exclude_sids,fields}

Query Sharadar S&P 500 index changes (additions and removals) from the local database and download to file.

Example URI

GET http://houston/fundamental/sharadar/sp500.csv?start_date=2014-01-01&end_date=2018-01-01&universes=nyse-stk&sids=FI12345&exclude_universes=other-universe&exclude_sids=FI23456&fields=EVENTCODE
URI Parameters
filetype
str (required) Example: .csv

output format

Choices: .csv .json

start_date
str (optional) Example: 2014-01-01

limit to data on or after this date

end_date
str (optional) Example: 2018-01-01

limit to data on or before this date

universes
str (optional) Example: nyse-stk

limit to these universes (pass multiple times for multiple universes)

sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: other-universe

exclude these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

fields
str (optional) Example: EVENTCODE

only return these fields (pass multiple times for multiple fields)

Response  200
Headers
Content-Type: text/csv

Alpaca Easy-to-Borrow

Collect ETB
POST/fundamental/alpaca/stockloan/etb

Collect Alpaca easy-to-borrow data and save to database.

Data is updated daily. Historical data is available from March 2019.

Example URI

POST http://houston/fundamental/alpaca/stockloan/etb
Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the etb data will be collected asynchronously"
}

Download ETB
GET/fundamental/alpaca/stockloan/etb{filetype}{?start_date,end_date,universes,sids,exclude_universes,exclude_sids}

Query Alpaca easy-to-borrow data from the local database and download to file.

Example URI

GET http://houston/fundamental/alpaca/stockloan/etb.csv?start_date=2018-04-16&end_date=2019-01-01&universes=japan-banks&sids=FI12345&exclude_universes=other-universe&exclude_sids=FI23456
URI Parameters
filetype
str (required) Example: .csv

output format

Choices: .csv .json

start_date
str (optional) Example: 2018-04-16

limit to data on or after this date

end_date
str (optional) Example: 2019-01-01

limit to data on or before this date

universes
str (optional) Example: japan-banks

limit to these universes (pass multiple times for multiple universes)

sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: other-universe

exclude these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

Response  200
Headers
Content-Type: text/csv

IBKR Shortable Shares

Collect Shortable Shares
POST/fundamental/ibkr/stockloan/shares{?countries}

Collect Interactive Brokers shortable shares data and save to database.

Data is organized by country and updated every 15 minutes. Historical data is available from April 2018. Detailed intraday data as well as aggregated daily data will be saved to the database.

Example URI

POST http://houston/fundamental/ibkr/stockloan/shares?countries=usa
URI Parameters
countries
str (optional) Example: usa

limit to these countries (pass ‘?’ or any invalid country to see available countries) (pass multiple times for multiple countries)

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the shortable shares will be collected asynchronously"
}

Download IBKR Shortable Shares
GET/fundamental/ibkr/stockloan/shares{filetype}{?start_date,end_date,universes,sids,exclude_universes,exclude_sids,aggregate}

Query intraday or daily Interactive Brokers shortable shares data from the local database and download to file.

Intraday data timestamps are UTC.

Example URI

GET http://houston/fundamental/ibkr/stockloan/shares.csv?start_date=2018-04-16&end_date=2019-01-01&universes=japan-banks&sids=FI12345&exclude_universes=other-universe&exclude_sids=FI23456&aggregate=true
URI Parameters
filetype
str (required) Example: .csv

output format

Choices: .csv .json

start_date
str (optional) Example: 2018-04-16

limit to data on or after this date

end_date
str (optional) Example: 2019-01-01

limit to data on or before this date

universes
str (optional) Example: japan-banks

limit to these universes (pass multiple times for multiple universes)

sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: other-universe

exclude these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

aggregate
bool (optional) Example: true

if true, return aggregated daily data containing the min, max, mean, and last shortable share quantities per security per day. If false or omitted, return intraday data.

Response  200
Headers
Content-Type: text/csv

IBKR Borrow Fees

Collect IBKR Borrow Fees
POST/fundamental/ibkr/stockloan/fees{?countries}

Collect Interactive Brokers borrow fees data and save to database.

Data is organized by country. Historical data is available from April 2018.

Example URI

POST http://houston/fundamental/ibkr/stockloan/fees?countries=usa
URI Parameters
countries
str (optional) Example: usa

limit to these countries (pass ‘?’ or any invalid country to see available countries) (pass multiple times for multiple countries)

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the borrow fees will be collected asynchronously"
}

Download IBKR Borrow Fees
GET/fundamental/ibkr/stockloan/fees{filetype}{?start_date,end_date,universes,sids,exclude_universes,exclude_sids}

Query Interactive Brokers borrow fees from the local database and download to file.

Example URI

GET http://houston/fundamental/ibkr/stockloan/fees.csv?start_date=2018-04-16&end_date=2019-01-01&universes=japan-banks&sids=FI12345&exclude_universes=other-universe&exclude_sids=FI23456
URI Parameters
filetype
str (required) Example: .csv

output format

Choices: .csv .json

start_date
str (optional) Example: 2018-04-16

limit to data on or after this date

end_date
str (optional) Example: 2019-01-01

limit to data on or before this date

universes
str (optional) Example: japan-banks

limit to these universes (pass multiple times for multiple universes)

sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: other-universe

exclude these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

Response  200
Headers
Content-Type: text/csv

IBKR Margin Requirements

Collect IBKR Margin Requirements
POST/fundamental/ibkr/stockloan/margin{?countries}

Collect Interactive Brokers margin requirements data and save to database.

The countries parameter refers to the country of the IBKR subsidiary where your account is located. (Margin requirements vary by IBKR subsidiary.) Note that this differs from the IBKR shortable shares or borrow fees APIs, where the countries parameter refers to the country of the security rather than the country of the account.

Historical data is available from April 2018.

Example URI

POST http://houston/fundamental/ibkr/stockloan/margin?countries=usa
URI Parameters
countries
str (required) Example: usa

the country of the IBKR subsidiary where your account is located (pass ‘?’ or any invalid country to see available countries)

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the margin requirements data will be collected asynchronously"
}

Download IBKR Margin Requirements
GET/fundamental/ibkr/stockloan/margin{filetype}{?start_date,end_date,universes,sids,exclude_universes,exclude_sids}

Query Interactive Brokers margin requirements from the local database and download to file.

Only stocks with special margin requirements are included in the dataset; stocks omitted from the dataset are subject to default margin requirements. A value of 0 in the dataset is a placeholder that likewise indicates default margin requirements apply.

Margin requirements are expressed in percentages, as whole numbers, for example 50 means 50% margin requirement, which is equivalent to 0.5.

Data timestamps are UTC.
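The notes above can be sketched in pandas: a hypothetical column of margin values is converted from whole-number percentages to ratios, with the 0 placeholder masked so it is not misread as a 0% requirement (the values and series name here are illustrative, not from the API):

```python
import numpy as np
import pandas as pd

# Hypothetical margin requirement values as stored in the dataset:
# whole-number percentages, where 0 is a placeholder meaning
# "default margin requirements apply"
margins = pd.Series([50, 100, 0, 25], name="MarginRequirement")

# Convert to ratios, masking the 0 placeholder so it is not
# treated as a 0% requirement
ratios = margins.replace(0, np.nan) / 100
print(ratios.tolist())
```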

Example URI

GET http://houston/fundamental/ibkr/stockloan/margin.csv?start_date=2018-04-16&end_date=2019-01-01&universes=japan-banks&sids=FI12345&exclude_universes=other-universe&exclude_sids=FI23456
URI Parameters
filetype
str (required) Example: .csv

output format

Choices: .csv .json

start_date
str (optional) Example: 2018-04-16

limit to data on or after this date

end_date
str (optional) Example: 2019-01-01

limit to data on or before this date

universes
str (optional) Example: japan-banks

limit to these universes (pass multiple times for multiple universes)

sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: other-universe

exclude these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

Response  200
Headers
Content-Type: text/csv

quantrocket.history

historical market data service

QuantRocket historical market data CLI

usage: quantrocket history [-h]
                           {create-custom-db,create-edi-db,create-ibkr-db,create-sharadar-db,create-usstock-db,list,config,drop-db,collect,queue,cancel,wait,get}
                           ...

subcommands

subcommand

Possible choices: create-custom-db, create-edi-db, create-ibkr-db, create-sharadar-db, create-usstock-db, list, config, drop-db, collect, queue, cancel, wait, get

Sub-commands

create-custom-db

create a new database into which custom data can be loaded

quantrocket history create-custom-db [-h] [-z BAR_SIZE] [-c [NAME:TYPE ...]]
                                     CODE

Positional Arguments

CODE

the code to assign to the database (lowercase alphanumerics and hyphens only)

Named Arguments

-z, --bar-size

the bar size that will be loaded. This isn’t enforced but facilitates efficient querying and provides a hint to other parts of the API. Use a Pandas timedelta string, for example, ‘1 day’ or ‘1 min’ or ‘1 sec’.

-c, --columns

the columns to create, specified as ‘name:type’. For example, ‘Close:float’ or ‘Volume:int’. Valid column types are ‘int’, ‘float’, ‘str’, ‘date’, and ‘datetime’. Column names must start with a letter and include only letters, numbers, and underscores. Sid and Date columns are automatically created and need not be specified. For boolean columns, choose type ‘int’ and store 1 or 0.

Create a new database into which custom data can be loaded.

Notes

Usage Guide:

Examples

Create a custom database for loading fundamental data:

quantrocket history create-custom-db custom-fundamentals --bar-size '1 day' --columns Revenue:int EPS:float Currency:str TotalAssets:int

Create a custom database for loading intraday OHLCV data:

quantrocket history create-custom-db custom-stk-1sec --bar-size '1 sec' --columns Open:float High:float Low:float Close:float Volume:int
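The column rules above (names must start with a letter and contain only letters, numbers, and underscores; five valid types) can be checked up front before creating a database. This validator is purely illustrative and is not part of the quantrocket API:

```python
import re

# Documented constraints for custom database columns
VALID_TYPES = {"int", "float", "str", "date", "datetime"}
NAME_RE = re.compile(r"^[A-Za-z][A-Za-z0-9_]*$")

def validate_columns(columns):
    """Raise ValueError if any column name or type breaks the documented rules."""
    for name, coltype in columns.items():
        if not NAME_RE.match(name):
            raise ValueError(f"invalid column name: {name!r}")
        if coltype not in VALID_TYPES:
            raise ValueError(f"invalid column type for {name!r}: {coltype!r}")
    return columns

validate_columns({"Revenue": "int", "EPS": "float", "Currency": "str"})
```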

create-edi-db

create a new database for collecting historical data from EDI

quantrocket history create-edi-db [-h] [-e [MIC ...]] CODE

Positional Arguments

CODE

the code to assign to the database (lowercase alphanumerics and hyphens only)

Named Arguments

-e, --exchanges

one or more exchange codes (MICs) which should be collected

Create a new database for collecting historical data from EDI.

Notes

Usage Guide:

Examples

Create a database for end-of-day China stock prices from EDI:

quantrocket history create-edi-db china-1d -e XSHG XSHE

create-ibkr-db

create a new database for collecting historical data from Interactive Brokers

quantrocket history create-ibkr-db [-h] [-u [UNIVERSE ...]] [-i [SID ...]]
                                   [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                                   [-z BAR_SIZE] [-t BAR_TYPE] [-o] [-p]
                                   [--times [HH:MM:SS ...] | --between-times
                                   HH:MM:SS HH:MM:SS] [--shard HOW]
                                   CODE

Positional Arguments

CODE

the code to assign to the database (lowercase alphanumerics and hyphens only)

Named Arguments

-u, --universes

include these universes

-i, --sids

include these sids

-s, --start-date

collect history back to this start date (default is to collect as far back as data is available)

-e, --end-date

collect history up to this end date (default is to collect up to the present)

-z, --bar-size

Possible choices: 1 secs, 5 secs, 10 secs, 15 secs, 30 secs, 1 min, 2 mins, 3 mins, 5 mins, 10 mins, 15 mins, 20 mins, 30 mins, 1 hour, 2 hours, 3 hours, 4 hours, 8 hours, 1 day, 1 week, 1 month

the bar size to collect

-t, --bar-type

Possible choices: TRADES, ADJUSTED_LAST, MIDPOINT, BID, ASK, BID_ASK, HISTORICAL_VOLATILITY, OPTION_IMPLIED_VOLATILITY

the bar type to collect (if not specified, defaults to MIDPOINT for FX and TRADES for everything else)

-o, --outside-rth

include data from outside regular trading hours (default is to limit to regular trading hours)

Default: False

-p, --primary-exchange

limit to data from the primary exchange

Default: False

--times

limit to these times (refers to the bar’s start time; mutually exclusive with --between-times)

--between-times

limit to times between these two times (refers to the bar’s start time; mutually exclusive with --times)

--shard

Possible choices: year, month, day, time, sid, sid,time, off

whether and how to shard the database, i.e. break it into smaller pieces. Required for intraday databases. Possible choices are year (separate database for each year), month (separate database for each year+month), day (separate database for each day), time (separate database for each bar time), sid (separate database for each security), sid,time (duplicate copies of database, one sharded by sid and the other by time), or off (no sharding). See http://qrok.it/h/shard for more help.

Create a new database for collecting historical data from Interactive Brokers.

The historical data requirements you specify when you create a new database (bar size, universes, etc.) are applied each time you collect data for that database.

Notes

Usage Guide:

Examples

Create an end-of-day database called “arca-etf-eod” for a universe called “arca-etf”:

quantrocket history create-ibkr-db 'arca-etf-eod' --universes 'arca-etf' --bar-size '1 day'

Create a similar end-of-day database, but collect primary exchange prices instead of consolidated prices, adjust prices for dividends (=ADJUSTED_LAST), and use an explicit start date:

quantrocket history create-ibkr-db 'arca-etf-eod' -u 'arca-etf' -z '1 day' --primary-exchange --bar-type 'ADJUSTED_LAST' -s 2010-01-01

Create a database of 1-minute bars showing the midpoint for a universe of FX pairs:

quantrocket history create-ibkr-db 'fx-1m' -u 'fx' -z '1 min' --bar-type MIDPOINT

Create a database of 1-second bars just before the open for a universe of Canadian energy stocks in 2016:

quantrocket history create-ibkr-db 'tse-enr-929' -u 'tse-enr' -z '1 secs' --outside-rth --times 09:29:55 09:29:56 09:29:57 09:29:58 09:29:59 -s 2016-01-01 -e 2016-12-31
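A list of one-second bar times like the one above can be generated with pandas rather than typed by hand (a sketch; the resulting strings can be passed to --times or the times parameter):

```python
import pandas as pd

# Build the five 1-second bar times leading up to the 09:30:00 open
times = pd.date_range("09:29:55", "09:29:59", freq="s").strftime("%H:%M:%S").tolist()
print(times)
```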

create-sharadar-db

create a new database for collecting historical data from Sharadar

quantrocket history create-sharadar-db [-h] [-t SEC_TYPE] [-c COUNTRY] CODE

Positional Arguments

CODE

the code to assign to the database (lowercase alphanumerics and hyphens only)

Named Arguments

-t, --sec-type

Possible choices: STK, ETF

the security type to collect

-c, --country

Possible choices: US, FREE

country to collect data for

Default: “US”

Create a new database for collecting historical data from Sharadar.

Notes

Usage Guide:

Examples

Create a database for Sharadar US stocks and call it “sharadar-us-stk-1d”:

quantrocket history create-sharadar-db sharadar-us-stk-1d --sec-type STK --country US

create-usstock-db

create a new database for collecting historical US stock data from QuantRocket

quantrocket history create-usstock-db [-h] [-z BAR_SIZE] [--free]
                                      [-u {US,FREE}]
                                      CODE

Positional Arguments

CODE

the code to assign to the database (lowercase alphanumerics and hyphens only)

Named Arguments

-z, --bar-size

Possible choices: 1 day

the bar size to collect

--free

limit to free sample data. Default is to collect the full dataset.

Default: False

-u, --universe

Possible choices: US, FREE

[DEPRECATED] whether to collect free sample data or the full dataset. This parameter is deprecated and will be removed in a future release. Please use --free to request free sample data or omit --free to request the full dataset.

Create a new database for collecting historical US stock data from QuantRocket.

Notes

Usage Guide:

Examples

Create a database for end-of-day US stock prices:

quantrocket history create-usstock-db usstock-1d

list

list history databases

quantrocket history list [-h]

List history databases.

Notes

Usage Guide:

Examples

quantrocket history list

config

return the configuration for a history database

quantrocket history config [-h] code

Positional Arguments

code

the database code

Return the configuration for a history database.

Notes

Usage Guide:

Examples

Return the configuration for a database called “jpn-lrg-15m”:

quantrocket history config jpn-lrg-15m

drop-db

delete a history database

quantrocket history drop-db [-h] --confirm-by-typing-db-code-again CODE code

Positional Arguments

code

the database code

Named Arguments

--confirm-by-typing-db-code-again

enter the db code again to confirm you want to drop the database, its config, and all its data

Delete a history database.

Deleting a history database deletes its configuration and data and is irreversible.

Notes

Usage Guide:

Examples

Delete a database called “jpn-lrg-15m”:

quantrocket history drop-db jpn-lrg-15m --confirm-by-typing-db-code-again jpn-lrg-15m

collect

collect historical market data from a vendor and save it to a history database

quantrocket history collect [-h] [-i [SID ...]] [-u [UNIVERSE ...]]
                            [-s YYYY-MM-DD] [-e YYYY-MM-DD] [-p]
                            CODE [CODE ...]

Positional Arguments

CODE

the database code(s) to collect data for

Named Arguments

-i, --sids

collect history for these sids, overriding config (typically used to collect a subset of securities). Only supported for IBKR databases.

-u, --universes

collect history for these universes, overriding config (typically used to collect a subset of securities). Only supported for IBKR databases.

-s, --start-date

collect history back to this start date, overriding config. Only supported for IBKR databases.

-e, --end-date

collect history up to this end date, overriding config. Only supported for IBKR databases.

-p, --priority

use the priority queue (default is to use the standard queue). Only applicable to IBKR databases.

Default: False

Collect historical market data from a vendor and save it to a history database.

The vendor and collection parameters are determined by the stored database configuration as defined at the time the database was created. For certain vendors, collection parameters can be overridden at the time of data collection.

Notes

Usage Guide:

Examples

Collect historical data for a database of Chinese stock prices:

quantrocket history collect china-1d

Collect historical data for an IBKR database of US futures, using the priority queue to jump in front of other queued IBKR collections:

quantrocket history collect cme-10m --priority

queue

get the current queue of historical data collections

quantrocket history queue [-h]

Get the current queue of historical data collections.

Notes

Usage Guide:

Examples

quantrocket history queue

cancel

cancel running or pending historical data collections

quantrocket history cancel [-h] CODE [CODE ...]

Positional Arguments

CODE

the database code(s) to cancel collections for

Cancel running or pending historical data collections.

Notes

Usage Guide:

Examples

Cancel collections for a database called japan-1d:

quantrocket history cancel japan-1d

wait

wait for historical data collection to finish

quantrocket history wait [-h] [-t TIMEDELTA] CODE [CODE ...]

Positional Arguments

CODE

the database code(s) to wait for

Named Arguments

-t, --timeout

time out if data collection hasn’t finished after this much time (use Pandas timedelta string, e.g. 30sec or 5min or 2h)

Wait for historical data collection to finish.

Notes

Usage Guide:

Examples

Wait at most 10 minutes for data collection to finish for a database called ‘fx-1h’:

quantrocket history wait 'fx-1h' -t 10min
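The timeout accepts any string Pandas can parse as a timedelta; a quick sketch of what the example strings resolve to:

```python
import pandas as pd

# Timedelta strings as used by the -t/--timeout option
for s in ["30sec", "5min", "2h"]:
    print(s, "=", pd.Timedelta(s).total_seconds(), "seconds")
```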

get

query historical market data from a history database and download to file

quantrocket history get [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                        [-u [UNIVERSE ...]] [-i [SID ...]]
                        [--exclude-universes [UNIVERSE ...]]
                        [--exclude-sids [SID ...]] [-t [HH:MM:SS ...]]
                        [-o OUTFILE] [-j] [-f [FIELD ...]] [-c HOW]
                        CODE

Positional Arguments

CODE

the code of the database to query

filtering options

-s, --start-date

limit to history on or after this date

-e, --end-date

limit to history on or before this date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

-t, --times

limit to these times

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-f, --fields

only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

-c, --cont-fut

Possible choices: concat

stitch futures into continuous contracts using this method (default is not to stitch together)

Query historical market data from a history database and download to file.

Notes

Usage Guide:

Examples

Download a CSV of all historical market data since 2015 from a database called “arca-eod” to a file called arca.csv:

quantrocket history get arca-eod --start-date 2015-01-01 -o arca.csv
quantrocket.history.create_edi_db(code, exchanges)

Create a new database for collecting historical data from EDI.

Parameters:
  • code (str, required) – the code to assign to the database (lowercase alphanumerics and hyphens only)

  • exchanges (list of str, required) – one or more exchange codes (MICs) which should be collected

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.history.create_ibkr_db(code, universes=None, sids=None, start_date=None, end_date=None, bar_size=None, bar_type=None, outside_rth=False, primary_exchange=False, times=None, between_times=None, shard=None)

Create a new database for collecting historical data from Interactive Brokers.

The historical data requirements you specify when you create a new database (bar size, universes, etc.) are applied each time you collect data for that database.

Parameters:
  • code (str, required) – the code to assign to the database (lowercase alphanumerics and hyphens only)

  • universes (list of str) – include these universes

  • sids (list of str) – include these sids

  • start_date (str (YYYY-MM-DD), optional) – collect history back to this start date (default is to collect as far back as data is available)

  • end_date (str (YYYY-MM-DD), optional) – collect history up to this end date (default is to collect up to the present)

  • bar_size (str, required) – the bar size to collect. Possible choices: “1 secs”, “5 secs”, “10 secs”, “15 secs”, “30 secs”, “1 min”, “2 mins”, “3 mins”, “5 mins”, “10 mins”, “15 mins”, “20 mins”, “30 mins”, “1 hour”, “2 hours”, “3 hours”, “4 hours”, “8 hours”, “1 day”, “1 week”, “1 month”

  • bar_type (str, optional) – the bar type to collect (if not specified, defaults to MIDPOINT for FX and TRADES for everything else). Possible choices: “TRADES”, “ADJUSTED_LAST”, “MIDPOINT”, “BID”, “ASK”, “BID_ASK”, “HISTORICAL_VOLATILITY”, “OPTION_IMPLIED_VOLATILITY”

  • outside_rth (bool) – include data from outside regular trading hours (default is to limit to regular trading hours)

  • primary_exchange (bool) – limit to data from the primary exchange (default False)

  • times (list of str (HH:MM:SS), optional) – limit to these times (refers to the bar’s start time; mutually exclusive with between_times)

  • between_times (list of str (HH:MM:SS), optional) – limit to times between these two times (refers to the bar’s start time; mutually exclusive with times)

  • shard (str, optional) – whether and how to shard the database, i.e. break it into smaller pieces. Required for intraday databases. Possible choices are year (separate database for each year), month (separate database for each year+month), day (separate database for each day), time (separate database for each bar time), sid (separate database for each security), sid,time (duplicate copies of database, one sharded by sid and the other by time), or off (no sharding). See http://qrok.it/h/shard for more help.

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.history.create_sharadar_db(code, sec_type, country='US')

Create a new database for collecting historical data from Sharadar.

Parameters:
  • code (str, required) – the code to assign to the database (lowercase alphanumerics and hyphens only)

  • sec_type (str, required) – the security type to collect. Possible choices: STK, ETF

  • country (str, required) – country to collect data for. Possible choices: US, FREE

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.history.create_usstock_db(code, bar_size=None, free=False, universe=None)

Create a new database for collecting historical US stock data from QuantRocket.

Parameters:
  • code (str, required) – the code to assign to the database (lowercase alphanumerics and hyphens only)

  • bar_size (str, optional) – the bar size to collect. Possible choices: 1 day

  • free (bool) – limit to free sample data. Default is to collect the full dataset.

  • universe (str, optional) – [DEPRECATED] whether to collect free sample data or the full dataset. This parameter is deprecated and will be removed in a future release. Please use free=True to request free sample data or free=False (or omit the free parameter) to request the full dataset.

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Create a database for end-of-day US stock prices:

>>> create_usstock_db('usstock-1d')
quantrocket.history.create_custom_db(code, bar_size=None, columns=None)

Create a new database into which custom data can be loaded.

Parameters:
  • code (str, required) – the code to assign to the database (lowercase alphanumerics and hyphens only)

  • bar_size (str, required) – the bar size that will be loaded. This isn’t enforced but facilitates efficient querying and provides a hint to other parts of the API. Use a Pandas timedelta string, for example, ‘1 day’ or ‘1 min’ or ‘1 sec’.

  • columns (dict of column name:type, required) – the columns to create, specified as a Python dictionary mapping column names to column types. For example, {“Close”:”float”, “Volume”:”int”}. Valid column types are “int”, “float”, “str”, “date”, and “datetime”. Column names must start with a letter and include only letters, numbers, and underscores. Sid and Date columns are automatically created and need not be specified. For boolean columns, choose type ‘int’ and store 1 or 0.

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Create a custom database for loading fundamental data:

>>> create_custom_db(
        "custom-fundamentals",
        bar_size="1 day",
        columns={
            "Revenue":"int",
            "EPS":"float",
            "Currency":"str",
            "TotalAssets":"int"})

Create a custom database for loading intraday OHLCV data:

>>> create_custom_db(
        "custom-stk-1sec",
        bar_size="1 sec",
        columns={
            "Open":"float",
            "High":"float",
            "Low":"float",
            "Close":"float",
            "Volume":"int"})
quantrocket.history.get_db_config(code)

Return the configuration for a history database.

Parameters:

code (str, required) – the database code

Returns:

config

Return type:

dict

Notes

Usage Guide:

quantrocket.history.drop_db(code, confirm_by_typing_db_code_again=None)

Delete a history database.

Deleting a history database deletes its configuration and data and is irreversible.

Parameters:
  • code (str, required) – the database code

  • confirm_by_typing_db_code_again (str, required) – enter the db code again to confirm you want to drop the database, its config, and all its data

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.history.list_databases()

List history databases.

Returns:

list of database codes

Return type:

list

Notes

Usage Guide:

quantrocket.history.collect_history(codes, sids=None, universes=None, start_date=None, end_date=None, priority=False)

Collect historical market data from a vendor and save it to a history database.

The vendor and collection parameters are determined by the stored database configuration as defined at the time the database was created. For certain vendors, collection parameters can be overridden at the time of data collection.

Parameters:
  • codes (list of str, required) – the database code(s) to collect data for

  • sids (list of str, optional) – collect history for these sids, overriding config (typically used to collect a subset of securities). Only supported for IBKR databases.

  • universes (list of str, optional) – collect history for these universes, overriding config (typically used to collect a subset of securities). Only supported for IBKR databases.

  • start_date (str (YYYY-MM-DD), optional) – collect history back to this start date, overriding config. Only supported for IBKR databases.

  • end_date (str (YYYY-MM-DD), optional) – collect history up to this end date, overriding config. Only supported for IBKR databases.

  • priority (bool) – use the priority queue (default is to use the standard queue). Only applicable to IBKR databases.

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.history.get_history_queue()

Get the current queue of historical data collections.

Returns:

queue by vendor

Return type:

dict

Notes

Usage Guide:

quantrocket.history.cancel_collections(codes)

Cancel running or pending historical data collections.

Parameters:

codes (list of str, required) – the database code(s) to cancel collections for

Returns:

queue by vendor

Return type:

dict

Notes

Usage Guide:

quantrocket.history.wait_for_collections(codes, timeout=None)

Wait for historical data collection to finish.

Parameters:
  • codes (list of str, required) – the database code(s) to wait for

  • timeout (str, optional) – time out if data collection hasn’t finished after this much time (use Pandas timedelta string, e.g. 30sec or 5min or 2h)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.history.download_history_file(code, filepath_or_buffer=None, output='csv', start_date=None, end_date=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None, times=None, cont_fut=None, fields=None)

Query historical market data from a history database and download to file.

Parameters:
  • code (str, required) – the code of the database to query

  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json, csv, default is csv)

  • start_date (str (YYYY-MM-DD), optional) – limit to history on or after this date

  • end_date (str (YYYY-MM-DD), optional) – limit to history on or before this date

  • universes (list of str, optional) – limit to these universes (default is to return all securities in database)

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

  • times (list of str (HH:MM:SS), optional) – limit to these times

  • cont_fut (str) – stitch futures into continuous contracts using this method (default is not to stitch together). Possible choices: concat

  • fields (list of str, optional) – only return these fields (pass [‘?’] or any invalid fieldname to see available fields)

Return type:

None

See also

quantrocket.get_prices

load prices into a DataFrame

Notes

Usage Guide:

Examples

You can use StringIO to load the CSV into pandas.

>>> import io
>>> import pandas as pd
>>> from quantrocket.history import download_history_file
>>> f = io.StringIO()
>>> download_history_file("my-db", f)
>>> history = pd.read_csv(f, parse_dates=["Date"])

Historical Data API

Resource Group

Database

Create History Database
PUT/history/databases/{code}{?universes,exchanges,sec_type,country,universe,start_date,end_date,vendor,bar_size,bar_type,outside_rth,primary_exchange,times,between_times,shard,columns}

Create a new history database.

Not all parameters are applicable to all vendors. Please see the Python API reference to determine which parameters are applicable to which vendors.

Example URI

PUT http://houston/history/databases/japan-bank-eod?universes=japan-bank&exchanges=XNYS&sec_type=STK&country=US&universe=US&start_date=2010-06-01&end_date=2019-06-30&vendor=usstock&bar_size=1 day&bar_type=TRADES&outside_rth=false&primary_exchange=true&times=09:29:50&between_times=09:30:00&shard=off&columns=Close:float
URI Parameters
code
str (required) Example: japan-bank-eod

the code to assign to the database (lowercase alphanumerics and hyphens only)

universes
str (optional) Example: japan-bank

include these universes (pass multiple times for multiple universes)

exchanges
str (optional) Example: XNYS

one or more exchange codes (MICs) which should be collected (pass multiple times for multiple exchanges)

country
str (optional) Example: US

country to collect data for

Choices: US FREE

sec_type
str (optional) Example: STK

the security type to collect

Choices: STK ETF

universe
str (optional) Example: US

[DEPRECATED] whether to collect free sample data or the full dataset. This parameter is deprecated and will be removed in a future release. Please use free=true to request free sample data or omit the free parameter to request the full dataset.

Choices: US FREE

free
bool (optional) Example: true

limit to free sample data. Default is to collect the full dataset. For vendor usstock.

start_date
str (optional) Example: 2010-06-01

collect history back to this start date (default is to collect as far back as data is available)

end_date
str (optional) Example: 2019-06-30

collect history up to this end date (default is to collect up to the present)

vendor
str (required) Example: usstock

the vendor to collect data from

Choices: custom edi ibkr sharadar usstock

bar_size
str (optional) Example: 1 day

the bar size to collect

Choices: 1 secs 5 secs 10 secs 15 secs 30 secs 1 min 2 mins 3 mins 5 mins 10 mins 15 mins 20 mins 30 mins 1 hour 2 hours 3 hours 4 hours 8 hours 1 day 1 week 1 month

bar_type
str (optional) Example: TRADES

the bar type to collect

Choices: TRADES ADJUSTED_LAST MIDPOINT BID ASK BID_ASK HISTORICAL_VOLATILITY OPTION_IMPLIED_VOLATILITY

outside_rth
bool (required) Example: false

include data from outside regular trading hours (default is to limit to regular trading hours)

primary_exchange
bool (required) Example: true

limit to data from the primary exchange (default False)

times
str (optional) Example: 09:29:50

limit to these times (pass multiple times for multiple times)

between_times
str (optional) Example: 09:30:00

limit to times between these two times (refers to the bar’s start time; mutually exclusive with times; should be passed exactly twice, once for start time and once for end time)

shard
str (optional) Example: off

whether and how to shard the database, i.e. break it into smaller pieces. Required for intraday databases. Possible choices are year (separate database for each year), month (separate database for each year+month), day (separate database for each day), time (separate database for each bar time), sid (separate database for each security), sid,time (duplicate copies of database, one sharded by sid and the other by time), or off (no sharding). See http://qrok.it/h/shard for more help.

Choices: year month day sid time sid,time off

columns
str (optional) Example: Close:float

for custom databases, the columns to create, specified as column name:type. For example, “Close:float” or “Volume:int”. Valid column types are “int”, “float”, “text”, “date”, and “datetime”. Column names must start with a letter and include only letters, numbers, and underscores. Sid and Date columns are automatically created and need not be specified. For boolean columns, choose type ‘int’ and store 1 or 0. Pass multiple times for multiple columns.

Response  200
Headers
Content-Type: application/json
Body
{
  "status": "successfully created quantrocket.v2.history.japan-bank-eod.sqlite"
}

Get History Database Config
GET/history/databases/{code}

Return the configuration for a history database.

Example URI

GET http://houston/history/databases/japan-bank-eod
URI Parameters
code
str (required) Example: japan-bank-eod

the database code

Response  200
Headers
Content-Type: application/json
Body
{
  "universes": [
    "japan-bank"
  ],
  "start_date": "2010-06-01",
  "vendor": "ibkr",
  "bar_size": "1 day",
  "bar_type": "TRADES",
  "primary_exchange": true,
  "times": [
    "09:29:50"
  ]
}

Delete History Database
DELETE/history/databases/{code}{?confirm_by_typing_db_code_again}

Delete a history database.

Example URI

DELETE http://houston/history/databases/japan-bank-eod?confirm_by_typing_db_code_again=japan-bank-eod
URI Parameters
code
str (required) Example: japan-bank-eod

the database code

confirm_by_typing_db_code_again
str (required) Example: japan-bank-eod

enter the db code again to confirm you want to drop the database, its config, and all its data

Response  200
Headers
Content-Type: application/json
Body
{
  "status": "deleted quantrocket.v2.history.japan-bank-eod.sqlite"
}

Databases

List History Databases
GET/history/databases

List history databases.

Example URI

GET http://houston/history/databases
Response  200
Headers
Content-Type: application/json
Body
[
  "demo-stk-1d",
  "usa-stk-15min"
]

Historical Data Queue

Collect Historical Data
POST/history/queue{?codes,priority,sids,universes,start_date,end_date}

Collect historical market data from a vendor and save it to a history database.

The vendor and collection parameters are determined by the stored database configuration as defined at the time the database was created. For certain vendors, collection parameters can be overridden at the time of data collection.

Example URI

POST http://houston/history/queue?codes=japan-bank-eod&priority=false&sids=FI12345&universes=japan-bank&start_date=2016-06-01&end_date=2019-06-30
URI Parameters
codes
str (required) Example: japan-bank-eod

the database code(s) to collect data for (pass multiple times for multiple codes)

sids
str (optional) Example: FI12345

collect history for these sids, overriding config (typically used to collect a subset of securities) (pass multiple times for multiple sids)

universes
str (optional) Example: japan-bank

collect history for these universes, overriding config (typically used to collect a subset of securities) (pass multiple times for multiple universes)

start_date
str (optional) Example: 2016-06-01

collect history back to this start date, overriding config

end_date
str (optional) Example: 2019-06-30

collect history up to this end date, overriding config

priority
bool (optional) Example: false

use the priority queue (default is to use the standard queue). Only applicable to IBKR databases.

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the historical data will be collected asynchronously"
}
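The multi-valued parameters above (codes, sids, universes) are passed by repeating the key in the query string. A minimal sketch with the Python standard library, borrowing the second database code from the listing example below:

```python
from urllib.parse import urlencode

# Repeat the key for multi-valued parameters such as "codes";
# urlencode preserves the order of the (key, value) pairs.
params = [
    ("codes", "japan-bank-eod"),
    ("codes", "usa-stk-15min"),
    ("start_date", "2016-06-01"),
    ("end_date", "2019-06-30"),
]
query = urlencode(params)
url = "http://houston/history/queue?" + query
print(url)
# http://houston/history/queue?codes=japan-bank-eod&codes=usa-stk-15min&start_date=2016-06-01&end_date=2019-06-30
```

With the houston session documented under quantrocket.houston, this path could then be POSTed to queue the collection.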

Get History Data Queue
GET/history/queue

Get the current queue of historical data collections.

Example URI

GET http://houston/history/queue
Response  200
Headers
Content-Type: application/json

Cancel Historical Data Collection
DELETE/history/queue{?codes}

Cancel running or pending historical data collections.

Example URI

DELETE http://houston/history/queue?codes=japan-bank-eod
URI Parameters
codes
str (required) Example: japan-bank-eod

the database code(s) to cancel collections for (pass multiple times for multiple codes)

Response  200
Headers
Content-Type: application/json

Wait for Historical Data Collection
PUT/history/queue{?codes,timeout}

Wait for historical data collection to finish.

Example URI

PUT http://houston/history/queue?codes=japan-bank-eod&timeout=30sec
URI Parameters
codes
str (required) Example: japan-bank-eod

the database code(s) to wait for (pass multiple times for multiple codes)

timeout
str (optional) Example: 30sec

time out if data collection hasn’t finished after this much time (use Pandas timedelta string, e.g. 30sec or 5min or 2h)

Response  200
Headers
Content-Type: application/json
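The timeout parameter accepts Pandas timedelta strings. A quick sketch (assumes pandas is installed) showing how the documented example strings parse:

```python
import pandas as pd

# The example timeout strings from the docs, converted to seconds
for s in ["30sec", "5min", "2h"]:
    print(s, "->", pd.Timedelta(s).total_seconds(), "seconds")
```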

Historical Market Data

Query Historical Data
GET/history/{code}.{filetype}{?start_date,end_date,universes,sids,exclude_universes,exclude_sids,times,cont_fut,fields}

Query historical market data from a history database and download to file.

Example URI

GET http://houston/history/japan-bank-eod.csv?start_date=2016-06-01&end_date=2017-06-01&universes=japan-bank&sids=FI12345&exclude_universes=other-universe&exclude_sids=FI23456&times=09:29:50&cont_fut=concat&fields=Close
URI Parameters
code
str (required) Example: japan-bank-eod

the code of the database to query

filetype
str (required) Example: csv

output format

Choices: csv json

start_date
str (optional) Example: 2016-06-01

limit to history on or after this date

end_date
str (optional) Example: 2017-06-01

limit to history on or before this date

universes
str (optional) Example: japan-bank

limit to these universes (default is to return all securities in database) (pass multiple times for multiple universes)

sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: other-universe

exclude these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

times
str (optional) Example: 09:29:50

limit to these times (pass multiple times for multiple times)

cont_fut
str (optional) Example: concat

stitch futures into continuous contracts using this method (default is not to stitch together)

Choices: concat

fields
str (optional) Example: Close

only return these fields

Response  200
Headers
Content-Type: text/csv
Body
Sid,Date,Close
FI1715006,2017-01-27,48.65
FI1715006,2017-01-30,47.67
FI1715006,2017-01-31,48.97
FI1715006,2017-02-01,49.26
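A minimal sketch of consuming the CSV response body above with the standard library, using the sample rows shown:

```python
import csv
import io

# Sample CSV response body from the example above
body = """\
Sid,Date,Close
FI1715006,2017-01-27,48.65
FI1715006,2017-01-30,47.67
FI1715006,2017-01-31,48.97
FI1715006,2017-02-01,49.26
"""

rows = list(csv.DictReader(io.StringIO(body)))
closes = [float(row["Close"]) for row in rows]
print(len(rows), "rows; last close:", closes[-1])
# 4 rows; last close: 49.26
```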

quantrocket.houston

Houston API gateway

QuantRocket Houston API Gateway CLI

usage: quantrocket houston [-h] {ping} ...

subcommands

subcommand

Possible choices: ping

Sub-commands

ping

ping houston

quantrocket houston ping [-h]

Ping houston.

Examples:

quantrocket houston ping
class quantrocket.houston.Houston

Subclass of requests.Session that provides an interface to the houston API gateway. Reads HOUSTON_URL (and Basic Auth credentials if applicable) from environment variables and applies them to each request. Simply provide the path, starting with /, for example:

>>> response = houston.get("/countdown/crontab")

Since each instance of Houston is a session, you can improve performance by using a single session for all requests. The module provides an instance of Houston, named houston.

Use the same session as other requests:

>>> from quantrocket.houston import houston

Use a new session:

>>> from quantrocket.houston import Houston
>>> houston = Houston()
__init__()
quantrocket.houston.ping()

Pings houston.

Returns:

reply from houston

Return type:

dict

 

Houston API

Only the ping endpoint and generic proxy endpoints are documented below. For service-specific endpoints, see the respective service documentation.

Resource Group

Ping

Ping
GET/ping

Ping houston to test connectivity.

Example URI

GET http://houston/ping
Response  200
Headers
Content-Type: application/json
Body
{
  "msg": "hello from houston"
}

HTTP Proxy

HTTP Proxy
GET/proxy/http/{service_name}/{port}/{path}

Use houston as a reverse proxy to a destination service speaking HTTP. Only GET is shown but the same signature applies for any HTTP method.

Example URI

GET http://houston/proxy/http/myservice/80/my/endpoint
URI Parameters
service_name
str (required) Example: myservice

the destination service name/host name

port
int (required) Example: 80

the destination port

path
str (required) Example: my/endpoint

the request path on the destination host
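The proxy path simply embeds the destination service name, port, and request path in order. A small sketch using the placeholder values from the example above:

```python
# Placeholder values from the example URI above
service_name, port, path = "myservice", 80, "my/endpoint"

proxy_path = f"/proxy/http/{service_name}/{port}/{path}"
print(proxy_path)  # /proxy/http/myservice/80/my/endpoint
```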

uWSGI Proxy

uWSGI Proxy
GET/proxy/uwsgi/{service_name}/{port}/{path}

Use houston as a reverse proxy to a destination service speaking the uWSGI protocol. Only GET is shown but the same signature applies for any HTTP method.

Example URI

GET http://houston/proxy/uwsgi/myservice/80/my/endpoint
URI Parameters
service_name
str (required) Example: myservice

the destination service name/host name

port
int (required) Example: 80

the destination port

path
str (required) Example: my/endpoint

the request path on the destination host

quantrocket.ibg

IB Gateway service

QuantRocket IB Gateway service CLI

usage: quantrocket ibg [-h] {credentials,status,start,stop,config} ...

subcommands

subcommand

Possible choices: credentials, status, start, stop, config

Sub-commands

credentials

set username/password and trading mode (paper/live) for IB Gateway

quantrocket ibg credentials [-h] [-u USERNAME] [-p PASSWORD]
                            [--paper | --live]
                            SERVICE_NAME

Positional Arguments

SERVICE_NAME

name of IB Gateway service to set credentials for (for example, ‘ibg1’)

Named Arguments

-u, --username

IBKR username (optional if only modifying trading mode)

-p, --password

IBKR password (if omitted and user is provided, will be prompted for password)

--paper

set trading mode to paper trading

--live

set trading mode to live trading

Set username/password and trading mode (paper/live) for IB Gateway, or view current username and trading mode.

Can be used to set new credentials or switch between paper and live trading (must have previously entered live credentials). Setting new credentials will restart IB Gateway and takes a moment to complete.

Credentials are encrypted at rest and never leave your deployment.

Notes

Usage Guide:

Examples

View current credentials for IB Gateway service named ibg1 (shows username and trading mode only):

quantrocket ibg credentials ibg1

Set credentials for ibg1 (will prompt for password):

quantrocket ibg credentials ibg1 -u myuser --paper

Leave credentials as-is but switch to live trading (must have previously entered live credentials):

quantrocket ibg credentials ibg1 --live

status

query statuses of IB Gateways

quantrocket ibg status [-h] [-s {running,stopped,error}]
                       [-g [SERVICE_NAME ...]]

Named Arguments

-s, --status

Possible choices: running, stopped, error

limit to IB Gateways in this status. Possible choices: [‘running’, ‘stopped’, ‘error’]

-g, --gateways

limit to these IB Gateways

Query statuses of IB Gateways.

Notes

Usage Guide:

Examples

List the status of all gateways:

quantrocket ibg status

Get a list of gateways that are running:

quantrocket ibg status --status running

start

start one or more IB Gateways

quantrocket ibg start [-h] [-g [SERVICE_NAME ...]] [-w]

Named Arguments

-g, --gateways

limit to these IB Gateways

-w, --wait

wait for the IB Gateway to start before returning (default is to start the gateways asynchronously)

Default: False

Start one or more IB Gateways.

Notes

Usage Guide:

Examples

Asynchronously start all gateways (that aren’t already running):

quantrocket ibg start

Start specific gateways and wait for them to come up:

quantrocket ibg start --gateways ibg1 ibg3 --wait

Restart all gateways:

quantrocket ibg stop --wait && quantrocket ibg start

stop

stop one or more IB Gateways

quantrocket ibg stop [-h] [-g [SERVICE_NAME ...]] [-w]

Named Arguments

-g, --gateways

limit to these IB Gateways

-w, --wait

wait for the IB Gateway to stop before returning (default is to stop the gateways asynchronously)

Default: False

Stop one or more IB Gateways.

Notes

Usage Guide:

Examples

Stop all gateways (that aren’t already stopped):

quantrocket ibg stop

Stop specific gateways and wait for them to stop:

quantrocket ibg stop --gateways ibg1 ibg3 --wait

config

upload a new config, or return the current configuration

quantrocket ibg config [-h] [FILENAME]

Positional Arguments

FILENAME

the config file to upload (if omitted, return the current config)

Upload a new IB Gateway permissions config, or return the current configuration.

Permission configs are only necessary when running multiple IB Gateways with differing market data permissions.

Examples

Upload a new config (replaces current config):

quantrocket ibg config myconfig.yml

Show current config:

quantrocket ibg config
quantrocket.ibg.get_credentials(gateway)

Return username and trading mode (paper/live) for IB Gateway.

Parameters:

gateway (str, required) – name of IB Gateway service to get credentials for (for example, ‘ibg1’)

Returns:

credentials

Return type:

dict

Notes

Usage Guide:

quantrocket.ibg.set_credentials(gateway, username=None, password=None, trading_mode=None)

Set username/password and trading mode (paper/live) for IB Gateway.

Can be used to set new credentials or switch between paper and live trading (must have previously entered live credentials). Setting new credentials will restart IB Gateway and takes a moment to complete.

Credentials are encrypted at rest and never leave your deployment.

Parameters:
  • gateway (str, required) – name of IB Gateway service to set credentials for (for example, ‘ibg1’)

  • username (str, optional) – IBKR username (optional if only modifying trading environment)

  • password (str, optional) – IBKR password (if omitted and username is provided, will be prompted for password)

  • trading_mode (str, optional) – the trading mode to use (‘paper’ or ‘live’)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.ibg.list_gateway_statuses(status=None, gateways=None)

Query statuses of IB Gateways.

Parameters:
  • status (str, optional) – limit to IB Gateways in this status. Possible choices: running, stopped, error

  • gateways (list of str, optional) – limit to these IB Gateways

Returns:

dict of gateway:status (if status arg not provided), or list of gateways (if status arg provided)

Return type:

dict or list

Notes

Usage Guide:

quantrocket.ibg.start_gateways(gateways=None, wait=False)

Start one or more IB Gateways.

Parameters:
  • gateways (list of str, optional) – limit to these IB Gateways

  • wait (bool) – wait for the IB Gateway to start before returning (default is to start the gateways asynchronously)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.ibg.stop_gateways(gateways=None, wait=False)

Stop one or more IB Gateways.

Parameters:
  • gateways (list of str, optional) – limit to these IB Gateways

  • wait (bool) – wait for the IB Gateway to stop before returning (default is to stop the gateways asynchronously)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

 

IB Gateway API

Resource Group

IB Gateway Credentials

Set IB Credentials
PUT/{gateway}/credentials{?username,password,trading_mode}

Set IB username/password and trading mode (paper/live) for IB Gateway.

Can be used to set new credentials or switch between paper and live trading (must have previously entered live credentials). Setting new credentials will restart IB Gateway and takes a moment to complete.

Credentials are encrypted at rest and never leave your deployment.

Example URI

PUT http://houston/ibg1/credentials?username=XXXXXXXXX&password=XXXXXXXXX&trading_mode=paper
URI Parameters
gateway
str (required) Example: ibg1

name of IB Gateway service to set credentials for

username
str (optional) Example: XXXXXXXXX

IB username (optional if only modifying trading environment)

password
str (optional) Example: XXXXXXXXX

IB password (optional if only modifying trading environment)

trading_mode
str (optional) Example: paper

the trading mode to use

Choices: paper live

Response  200
Headers
Content-Type: application/json

Get IB Gateway Credentials
GET/{gateway}/credentials

Returns IB username and trading mode (paper/live) for IB Gateway.

Example URI

GET http://houston/ibg1/credentials
URI Parameters
gateway
str (required) Example: ibg1

name of IB Gateway service to get credentials for

Response  200
Headers
Content-Type: application/json
Body
{
    "TWSUSERID": "XXXXXXXXX",
    "TRADING_MODE": "paper"
}

IB Gateway

List Gateways
GET/ibgrouter/gateways{?statuses,gateways}

Query statuses of IB Gateways.

Example URI

GET http://houston/ibgrouter/gateways?statuses=running&gateways=ibg1
URI Parameters
statuses
str (optional) Example: running

limit to IB Gateway services in this status (pass multiple times for multiple statuses)

Choices: running stopped error

gateways
str (optional) Example: ibg1

limit to these IB Gateway services (pass multiple times for multiple gateways)

Response  200
Headers
Content-Type: application/json
Body
{
  "ibg1": "running",
  "ibg2": "stopped"
}
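For example, the sample response above can be filtered client-side to list only the running gateways:

```python
# Sample response body from GET /ibgrouter/gateways
statuses = {"ibg1": "running", "ibg2": "stopped"}

running = [gateway for gateway, status in statuses.items() if status == "running"]
print(running)  # ['ibg1']
```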

Start Gateways
POST/ibgrouter/gateways{?gateways,wait}

Start IB Gateway services.

Example URI

POST http://houston/ibgrouter/gateways?gateways=ibg1&wait=true
URI Parameters
gateways
str (optional) Example: ibg1

limit to these IB Gateway services (pass multiple times for multiple gateways)

wait
bool (optional) Example: true

wait for the IB Gateway services to start before returning (default is to start the gateways asynchronously)

Response  200
Headers
Content-Type: application/json
Body
{
  "ibg1": {
    "status": "running"
  },
  "ibg2": {
    "status": "running"
  }
}

Stop Gateways
DELETE/ibgrouter/gateways{?gateways,wait}

Stop IB Gateway services.

Example URI

DELETE http://houston/ibgrouter/gateways?gateways=ibg1&wait=true
URI Parameters
gateways
str (optional) Example: ibg1

limit to these IB Gateway services (pass multiple times for multiple gateways)

wait
bool (optional) Example: true

wait for the IB Gateway services to stop before returning (default is to stop the gateways asynchronously)

Response  200
Headers
Content-Type: application/json
Body
{
  "ibg1": {
    "status": "stopped"
  },
  "ibg2": {
    "status": "stopped"
  }
}

IB Gateway Config

Get Config
GET/ibgrouter/config

Returns the current IB Gateway permissions config.

Example URI

GET http://houston/ibgrouter/config
Response  200
Headers
Content-Type: application/json
Body
{
  "ibg1": {
    "marketdata": {
      "STK": [
        "ASX",
        "ISLAND",
        "NYSE",
        "TSE"
      ]
    },
    "research": [
      "wsh"
    ]
  },
  "ibg2": {
    "marketdata": {
      "STK": [
        "ISLAND",
        "NYSE"
      ],
      "FUT": [
        "CME"
      ]
    }
  }
}
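For example, the sample config above can be inspected client-side to see which gateways are permissioned for a given market data feed:

```python
# Sample permissions config from the response above
config = {
    "ibg1": {
        "marketdata": {"STK": ["ASX", "ISLAND", "NYSE", "TSE"]},
        "research": ["wsh"],
    },
    "ibg2": {
        "marketdata": {"STK": ["ISLAND", "NYSE"], "FUT": ["CME"]},
    },
}

# Which gateways can serve NYSE stock market data?
nyse_gateways = [
    gateway for gateway, perms in config.items()
    if "NYSE" in perms.get("marketdata", {}).get("STK", [])
]
print(nyse_gateways)  # ['ibg1', 'ibg2']
```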

Load Config
PUT/ibgrouter/config

Upload a new IB Gateway permissions config.

Permission configs are only necessary when running multiple IB Gateways with differing market data permissions.

Example URI

PUT http://houston/ibgrouter/config
Request
Headers
Content-Type: application/x-yaml
Body
ibg1:
  marketdata:
    STK:
    - ASX
    - ISLAND
    - NYSE
    - TSE
ibg2:
  marketdata:
    STK:
    - ISLAND
    - TSEJ
Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the config will be loaded asynchronously"
}

quantrocket.license

license service

QuantRocket license service CLI

usage: quantrocket license [-h]
                           {get,set,alpaca-key,polygon-key,quandl-key} ...

subcommands

subcommand

Possible choices: get, set, alpaca-key, polygon-key, quandl-key

Sub-commands

get

return the current license profile

quantrocket license get [-h] [--force-refresh]

Named Arguments

--force-refresh

refresh the license profile before returning it (default is to return the cached profile, which is refreshed every few minutes)

Default: False

Return the current license profile.

Notes

Usage Guide:

Examples

View the current license profile:

quantrocket license get

set

set QuantRocket license key

quantrocket license set [-h] LICENSEKEY

Positional Arguments

LICENSEKEY

the license key for your account

Set QuantRocket license key.

Notes

Usage Guide:

Examples

quantrocket license set XXXXXXXXXX

alpaca-key

set Alpaca API key, or view the current API key

quantrocket license alpaca-key [-h] [-a API_KEY] [-s SECRET_KEY]
                               [--paper | --live] [-r DATA_FEED]

Named Arguments

-a, --api-key

Alpaca API key ID

-s, --secret-key

Alpaca secret key (if omitted, will be prompted for secret key)

--paper

set trading mode to paper trading

--live

set trading mode to live trading

-r, --realtime-data

Possible choices: iex, sip

the real-time data feed to which this API key is subscribed. Possible choices: [‘iex’, ‘sip’]. Default is ‘iex’.

Set Alpaca API key, or view the current API key.

Your credentials are encrypted at rest and never leave your deployment.

Notes

Usage Guide:

Examples

View current live and paper API keys:

quantrocket license alpaca-key

Set Alpaca live API key (will prompt for secret key) and specify SIP as the real-time data permission for this account:

quantrocket license alpaca-key --api-key AK123 --live --realtime-data sip

Set Alpaca paper API key (will prompt for secret key):

quantrocket license alpaca-key --api-key PK123 --paper

polygon-key

set Polygon API key, or view the current API key

quantrocket license polygon-key [-h] [API_KEY]

Positional Arguments

API_KEY

Polygon API key

Set Polygon API key, or view the current API key.

Your credentials are encrypted at rest and never leave your deployment.

Notes

Usage Guide:

Examples

View current API key:

quantrocket license polygon-key

Set Polygon API key:

quantrocket license polygon-key K123

quandl-key

set Quandl API key, or view the current API key

quantrocket license quandl-key [-h] [API_KEY]

Positional Arguments

API_KEY

Quandl API key

Set Quandl API key, or view the current API key.

Your credentials are encrypted at rest and never leave your deployment.

Notes

Usage Guide:

Examples

View current API key:

quantrocket license quandl-key

Set Quandl API key:

quantrocket license quandl-key K123
quantrocket.license.get_license_profile(force_refresh=False)

Return the current license profile.

Parameters:

force_refresh (bool) – refresh the license profile before returning it (default is to return the cached profile, which is refreshed every few minutes)

Returns:

license profile

Return type:

dict

Notes

Usage Guide:

quantrocket.license.set_license(key)

Set QuantRocket license key.

Parameters:

key (str, required) – the license key for your account

Returns:

license profile

Return type:

dict

Notes

Usage Guide:

quantrocket.license.get_alpaca_key()

Returns the current API key(s) for Alpaca.

Returns:

credentials

Return type:

dict

Notes

Usage Guide:

quantrocket.license.set_alpaca_key(api_key, trading_mode, secret_key=None, realtime_data='iex')

Set Alpaca API key.

Your credentials are encrypted at rest and never leave your deployment.

Parameters:
  • api_key (str, required) – Alpaca API key ID

  • trading_mode (str, required) – the trading mode of this API key (‘paper’ or ‘live’)

  • secret_key (str, optional) – Alpaca secret key (if omitted, will be prompted for secret key)

  • realtime_data (str, optional) – the real-time data feed to which this API key is subscribed. Possible choices: ‘iex’, ‘sip’. Default is ‘iex’.

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.license.get_polygon_key()

Returns the current API key for Polygon.

Returns:

credentials

Return type:

dict

Notes

Usage Guide:

quantrocket.license.set_polygon_key(api_key)

Set Polygon API key.

Your credentials are encrypted at rest and never leave your deployment.

Parameters:

api_key (str, required) – Polygon API key

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.license.get_quandl_key()

Returns the current API key for Quandl.

Returns:

credentials

Return type:

dict

Notes

Usage Guide:

quantrocket.license.set_quandl_key(api_key)

Set Quandl API key.

Your credentials are encrypted at rest and never leave your deployment.

Parameters:

api_key (str, required) – Quandl API key

Returns:

status message

Return type:

dict

Notes

Usage Guide:

 

License API

Resource Group

License Profile

Get License Profile
GET/license-service/license/{?force_refresh}

Return the current license profile.

Example URI

GET http://houston/license-service/license/?force_refresh=false
URI Parameters
force_refresh
bool (optional) Example: false

refresh the license profile before returning it (default is to return the cached profile, which is refreshed every few minutes)

Response  200
Headers
Content-Type: application/json
Body
{
  "licensekey": "XXXXXXXXXX",
  "software_license": {
    "use_case": "Individual Non-Professional",
    "user_limit": 1,
    "concurrent_install_limit": 2,
    "account": {
      "account_limit": "1000000 USD"
    }
  }
}
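For example, the account limit in the sample profile above can be parsed out of the JSON response:

```python
import json

# Sample response body from the example above
body = """
{
  "licensekey": "XXXXXXXXXX",
  "software_license": {
    "use_case": "Individual Non-Professional",
    "user_limit": 1,
    "concurrent_install_limit": 2,
    "account": {
      "account_limit": "1000000 USD"
    }
  }
}
"""

profile = json.loads(body)
amount, currency = profile["software_license"]["account"]["account_limit"].split()
print(int(amount), currency)  # 1000000 USD
```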

Set License Profile
PUT/license-service/license/{key}

Set QuantRocket license key.

Example URI

PUT http://houston/license-service/license/XXXXXXXX
URI Parameters
key
str (required) Example: XXXXXXXX

the license key for your account

Response  200
Headers
Content-Type: application/json
Body
{
  "licensekey": "XXXXXXXXXX",
  "software_license": {
    "use_case": "Individual Non-Professional",
    "user_limit": 1,
    "concurrent_install_limit": 2,
    "account": {
      "account_limit": "1000000 USD"
    }
  }
}

Third-Party API Key

Get API Key
GET/license-service/credentials/{vendor}

Returns the current API key for a third-party provider.

Example URI

GET http://houston/license-service/credentials/alpaca
URI Parameters
vendor
str (required) Example: alpaca

the vendor to return credentials for

Choices: alpaca polygon quandl

Response  200
Headers
Content-Type: application/json

Set API Key
PUT/license-service/credentials/{vendor}{?api_key,secret_key,trading_mode,realtime_data}

Set API key for a third-party provider.

Your credentials are encrypted at rest and never leave your deployment.

Example URI

PUT http://houston/license-service/credentials/alpaca?api_key=XXXXXXXX&secret_key=ZZZZZZZZZZ&trading_mode=paper&realtime_data=sip
URI Parameters
vendor
str (required) Example: alpaca

the vendor to set credentials for

Choices: alpaca polygon quandl

api_key
str (required) Example: XXXXXXXX

the third-party API key

secret_key
str (optional) Example: ZZZZZZZZZZ

the third-party secret key, if applicable

trading_mode
str (optional) Example: paper

the trading mode, if applicable

realtime_data
str (optional) Example: sip

(alpaca only) the real-time data feed to which this API key is subscribed

Choices: iex sip

Response  200
Headers
Content-Type: application/json

quantrocket.master

securities master service

QuantRocket securities master CLI

usage: quantrocket master [-h]
                          {collect-alpaca,collect-edi,collect-figi,collect-ibkr,collect-sharadar,collect-usstock,collect-ibkr-options,get,list-ibkr-exchanges,diff-ibkr,delist-ibkr,list-universes,universe,delete-universe,create-ibkr-combo,rollrules,collect-ibkr-calendar,calendar,isopen,isclosed,ticksize}
                          ...

subcommands

subcommand

Possible choices: collect-alpaca, collect-edi, collect-figi, collect-ibkr, collect-sharadar, collect-usstock, collect-ibkr-options, get, list-ibkr-exchanges, diff-ibkr, delist-ibkr, list-universes, universe, delete-universe, create-ibkr-combo, rollrules, collect-ibkr-calendar, calendar, isopen, isclosed, ticksize

Sub-commands

collect-alpaca

collect securities listings from Alpaca and store in securities master database

quantrocket master collect-alpaca [-h]

Collect securities listings from Alpaca and store in securities master database.

Notes

Usage Guide:

Examples

quantrocket master collect-alpaca

collect-edi

collect securities listings from EDI and store in securities master database

quantrocket master collect-edi [-h] [-e [MIC ...]]

Named Arguments

-e, --exchanges

collect listings for these exchanges (identified by MICs)

Collect securities listings from EDI and store in securities master database.

Notes

Usage Guide:

Examples

Collect sample listings:

quantrocket master collect-edi --exchanges FREE

Collect listings for all permitted exchanges

quantrocket master collect-edi

Collect all Chinese stock listings:

quantrocket master collect-edi -e XSHG XSHE

collect-figi

collect securities listings from Bloomberg OpenFIGI and store in securities master database

quantrocket master collect-figi [-h]

Collect securities listings from Bloomberg OpenFIGI and store in securities master database.

OpenFIGI provides several useful security attributes, including market sector, a detailed security type, and a share class-level FIGI identifier.

The collected data fields show up in the master file under the prefix “figi_*”.

This command does not directly query the OpenFIGI API but rather downloads a dump of all FIGIs which QuantRocket has previously mapped to securities from other vendors.

Notes

Usage Guide:

Examples

quantrocket master collect-figi

collect-ibkr

collect securities listings from Interactive Brokers and store in securities master database

quantrocket master collect-ibkr [-h] [-e [EXCHANGE ...]] [-t [SEC_TYPE ...]]
                                [-c [CURRENCY ...]] [-s [SYMBOL ...]]
                                [-u [UNIVERSE ...]] [-i [SID ...]]

Named Arguments

-e, --exchanges

one or more exchange codes to collect listings for (required unless providing universes or sids). For sample data use exchange code ‘FREE’

-t, --sec-types

Possible choices: STK, ETF, FUT, CASH, IND

limit to these security types. Possible choices: [‘STK’, ‘ETF’, ‘FUT’, ‘CASH’, ‘IND’]

-c, --currencies

limit to these currencies

-s, --symbols

limit to these symbols

-u, --universes

limit to these universes

-i, --sids

limit to these sids

Collect securities listings from Interactive Brokers and store in securities master database.

Specify an exchange (optionally filtering by security type, currency, and/or symbol) to collect listings from the IBKR website and collect associated contract details from the IBKR API. Or, specify universes or sids to collect details from the IBKR API, bypassing the website.

Notes

Usage Guide:

Examples

Collect free sample listings:

quantrocket master collect-ibkr --exchanges FREE

Collect all Toronto Stock Exchange stock listings:

quantrocket master collect-ibkr --exchanges TSE --sec-types STK

Collect all NYSE ARCA ETF listings:

quantrocket master collect-ibkr -e ARCA --sec-types ETF

Collect specific symbols from Nasdaq:

quantrocket master collect-ibkr -e NASDAQ --symbols AAPL GOOG NFLX

Re-collect contract details for an existing universe called “japan-fin”:

quantrocket master collect-ibkr --universes japan-fin

collect-sharadar

collect securities listings from Sharadar and store in securities master database

quantrocket master collect-sharadar [-h] [-c [COUNTRY ...]]

Named Arguments

-c, --countries

Possible choices: US, FREE

collect listings for these countries. Possible choices: [‘US’, ‘FREE’]

Default: [‘US’]

Collect securities listings from Sharadar and store in securities master database.

Notes

Usage Guide:

Examples

Collect sample listings:

quantrocket master collect-sharadar --countries FREE

Collect all US listings:

quantrocket master collect-sharadar --countries US

collect-usstock

collect US stock listings from QuantRocket and store in securities master database

quantrocket master collect-usstock [-h]

Collect US stock listings from QuantRocket and store in securities master database.

Notes

Usage Guide:

Examples

quantrocket master collect-usstock

collect-ibkr-options

collect IBKR option chains for underlying securities

quantrocket master collect-ibkr-options [-h] [-u [UNIVERSE ...]]
                                        [-i [SID ...]] [-f INFILE]

Named Arguments

-u, --universes

collect options for these universes of underlying securities

-i, --sids

collect options for these underlying sids

-f, --infile

collect options for the sids in this file (specify ‘-’ to read file from stdin)

Collect IBKR option chains for underlying securities.

Note: option chains often consist of hundreds, sometimes thousands, of options per underlying security. Be aware that requesting option chains for large universes of underlying securities, such as all stocks on the NYSE, can take many hours to complete.

Notes

Usage Guide:

Examples

Collect option chains for several underlying securities:

quantrocket master collect-ibkr-options --sids FIBBG000LV0836 FIBBG000B9XRY4

Collect option chains for NQ futures:

quantrocket master get -e CME -s NQ -t FUT | quantrocket master collect-ibkr-options -f -

Collect option chains for a large universe of stocks called “nyse-stk” (see note above):

quantrocket master collect-ibkr-options -u "nyse-stk"

get

query security details from the securities master database and download to file

quantrocket master get [-h] [-e [EXCHANGE ...]] [-t [SEC_TYPE ...]]
                       [-c [CURRENCY ...]] [-u [UNIVERSE ...]]
                       [-s [SYMBOL ...]] [-i [SID ...]]
                       [--exclude-universes [UNIVERSE ...]]
                       [--exclude-sids [SID ...]] [--exclude-delisted]
                       [--exclude-expired] [-m] [-v [VENDOR ...]] [-o OUTFILE]
                       [-j] [-f [FIELD ...]]

filtering options

-e, --exchanges

limit to these exchanges. You can specify exchanges using the MIC or the vendor’s exchange code.

-t, --sec-types

Possible choices: STK, ETF, FUT, CASH, IND, OPT, FOP, BAG

limit to these security types. Possible choices: [‘STK’, ‘ETF’, ‘FUT’, ‘CASH’, ‘IND’, ‘OPT’, ‘FOP’, ‘BAG’]

-c, --currencies

limit to these currencies

-u, --universes

limit to these universes

-s, --symbols

limit to these symbols

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

--exclude-delisted

exclude delisted securities (default is to include them)

Default: False

--exclude-expired

exclude expired contracts (default is to include them)

Default: False

-m, --frontmonth

exclude backmonth and expired futures contracts

Default: False

-v, --vendors

Possible choices: alpaca, edi, ibkr, sharadar, usstock

limit to these vendors. Possible choices: [‘alpaca’, ‘edi’, ‘ibkr’, ‘sharadar’, ‘usstock’]

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-f, --fields

return specific fields. By default a core set of fields is returned, but additional vendor-specific fields are also available. To return non-core fields, you can reference them by name, or pass “*” to return all available fields. To return all fields for a specific vendor, pass the vendor prefix followed by “*”, for example “edi*” for all EDI fields. Pass “?*” (or any invalid vendor prefix plus “*”) to see available vendor prefixes. Pass “?” or any invalid fieldname to see all available fields.

Query security details from the securities master database and download to file.

Notes

Parameters for filtering query results are combined according to the following rules. First, the master service determines what to include in the result set, based on the inclusion filters: --exchanges, --sec-types, --currencies, --universes, --symbols, and --sids. With the exception of --sids, these parameters are ANDed together. That is, securities must satisfy all of the parameters to be included. If --vendors is provided, only those vendors are searched for the purpose of determining matches.

The --sids parameter is treated differently. Securities matching --sids are always included, regardless of whether they meet the other inclusion criteria.

After determining what to include, the master service then applies the exclusion filters (--exclude-sids, --exclude-universes, --exclude-delisted, --exclude-expired, and --frontmonth) to determine what (if anything) should be removed from the result set. Exclusion filters are ORed, that is, securities are excluded if they match any of the exclusion criteria.

Usage Guide:

Examples

Download NYSE and NASDAQ securities to file, using MICs to specify the exchanges:

quantrocket master get --exchanges XNYS XNAS -o securities.csv

Download NYSE and NASDAQ securities to file, using IBKR exchange codes to specify the exchanges, and include all IBKR fields:

quantrocket master get --exchanges NYSE NASDAQ -f 'ibkr*' -o securities.csv

Download a CSV of all ARCA ETFs and use it to create a universe called “arca-etf”:

quantrocket master get --exchanges ARCA --sec-types ETF | quantrocket master universe "arca-etf" --infile -

Query the exchange and currency for all listings of AAPL and format for terminal display:

quantrocket master get --symbols AAPL --fields Exchange Currency | csvlook -I

list-ibkr-exchanges

list exchanges by security type and country as found on the IBKR website

quantrocket master list-ibkr-exchanges [-h] [-r [REGION ...]]
                                       [-t [SEC_TYPE ...]]

Named Arguments

-r, --regions

Possible choices: north_america, europe, asia, global

limit to these regions. Possible choices: [‘north_america’, ‘europe’, ‘asia’, ‘global’]

-t, --sec-types

Possible choices: STK, ETF, FUT, CASH, IND

limit to these security types. Possible choices: [‘STK’, ‘ETF’, ‘FUT’, ‘CASH’, ‘IND’]

List exchanges by security type and country as found on the IBKR website.

Notes

Usage Guide:

Examples

List all exchanges:

quantrocket master list-ibkr-exchanges

List stock exchanges in North America:

quantrocket master list-ibkr-exchanges --regions north_america --sec-types STK

diff-ibkr

flag security details that have changed in IBKR’s system since the time they were last collected into the securities master database

quantrocket master diff-ibkr [-h] [-u [UNIVERSE ...]] [-i [SID ...]]
                             [-n INFILE] [-f [FIELD ...]] [--delist-missing]
                             [--delist-exchanges [EXCHANGE ...]] [-w]

Named Arguments

-u, --universes

limit to these universes

-i, --sids

limit to these sids

-n, --infile

limit to the sids in this file (specify ‘-’ to read file from stdin)

-f, --fields

only diff these fields (field name should start with ‘ibkr’)

--delist-missing

auto-delist securities that are no longer available from IBKR

Default: False

--delist-exchanges

auto-delist securities that are associated with these exchanges

-w, --wait

run the diff synchronously and return the diff (otherwise run asynchronously and log the results, if any, to flightlog)

Default: False

Flag security details that have changed in IBKR’s system since the time they were last collected into the securities master database.

Diff can be run synchronously or asynchronously (asynchronous is the default and is recommended if diffing more than a handful of securities).

Notes

Usage Guide:

Examples

Asynchronously generate a diff for all securities in a universe called “italy-stk” and log the results, if any, to flightlog:

quantrocket master diff-ibkr -u "italy-stk"

Asynchronously generate a diff for all securities in a universe called “italy-stk”, looking only for sector or industry changes:

quantrocket master diff-ibkr -u "italy-stk" --fields ibkr_Sector ibkr_Industry

Synchronously get a diff for specific securities by sid:

quantrocket master diff-ibkr --sids FIBBG000LV0836 FIBBG000B9XRY4 --wait

Synchronously get a diff for specific securities without knowing their sids:

quantrocket master get -e NASDAQ -t STK -s AAPL FB GOOG | quantrocket master diff-ibkr --wait --infile -

Asynchronously generate a diff for all securities in a universe called “nasdaq-sml” and auto-delist any symbols that are no longer available from IBKR or that are now associated with the PINK exchange:

quantrocket master diff-ibkr -u "nasdaq-sml" --delist-missing --delist-exchanges PINK

delist-ibkr

mark an IBKR security as delisted

quantrocket master delist-ibkr [-h] [-i SID] [-s SYMBOL] [-e EXCHANGE]
                               [-c CURRENCY] [-t SEC_TYPE]

Named Arguments

-i, --sid

the sid of the security to be delisted

-s, --symbol

the symbol to be delisted (if sid not provided)

-e, --exchange

the exchange of the security to be delisted (if needed to disambiguate)

-c, --currency

the currency of the security to be delisted (if needed to disambiguate)

-t, --sec-type

Possible choices: STK, ETF, FUT, CASH, IND

the security type of the security to be delisted (if needed to disambiguate). Possible choices: [‘STK’, ‘ETF’, ‘FUT’, ‘CASH’, ‘IND’]

Mark an IBKR security as delisted.

This does not remove any data but simply marks the security as delisted so that data services won’t attempt to collect data for the security and so that the security can be optionally excluded from query results.

The security can be specified by sid or a combination of other parameters (for example, symbol + exchange). As a precaution, the request will fail if the parameters match more than one security.

Notes

Usage Guide:

Examples

Delist a security by sid:

quantrocket master delist-ibkr -i FIBBG1234567890

Delist a security by symbol + exchange:

quantrocket master delist-ibkr -s ABC -e NYSE

list-universes

list universes and their size

quantrocket master list-universes [-h]

List universes and their size.

Notes

Usage Guide:

Examples

quantrocket master list-universes

universe

create a universe of securities

quantrocket master universe [-h] [-f INFILE] [-i [SID ...]]
                            [--from-universes [UNIVERSE ...]]
                            [--exclude-delisted] [-a | -r]
                            CODE

Positional Arguments

CODE

the code to assign to the universe (lowercase alphanumerics and hyphens only)

Named Arguments

-f, --infile

create the universe from the sids in this file (specify ‘-’ to read file from stdin)

-i, --sids

create the universe from these sids

--from-universes

create the universe from these existing universes

--exclude-delisted

exclude delisted securities and expired contracts that would otherwise be included (default is to include them)

Default: False

-a, --append

append to universe if universe already exists

Default: False

-r, --replace

replace universe if universe already exists

Default: False

Create a universe of securities.

Notes

Usage Guide:

Examples

Download a CSV of Italian stocks then upload it to create a universe called “italy-stk”:

quantrocket master get --exchanges BVME --sec-types STK -f italy.csv
quantrocket master universe "italy-stk" -f italy.csv

In one line, download a CSV of all ARCA ETFs and append to a universe called “arca-etf”:

quantrocket master get --exchanges ARCA --sec-types ETF | quantrocket master universe "arca-etf" --append --infile -

Create a universe consisting of several existing universes:

quantrocket master universe "asx" --from-universes "asx-sml" "asx-mid" "asx-lrg"

Copy a universe but exclude delisted securities:

quantrocket master universe "hong-kong-active" --from-universes "hong-kong" --exclude-delisted

delete-universe

delete a universe

quantrocket master delete-universe [-h] code

Positional Arguments

code

the universe code

Delete a universe.

The listings details of the member securities won’t be deleted, only their grouping as a universe.

Notes

Usage Guide:

Examples

Delete the universe called “italy-stk”:

quantrocket master delete-universe 'italy-stk'

create-ibkr-combo

Create an IBKR combo (aka spread)

quantrocket master create-ibkr-combo [-h] PATH

Positional Arguments

PATH

a JSON file containing an array of the combo legs, where each leg is an array specifying action, ratio, and sid

Create an IBKR combo (aka spread), which is a composite instrument consisting of two or more individual instruments (legs) that are traded as a single instrument.

Each user-defined combo is stored in the securities master database with a SecType of “BAG”. The combo legs are stored in the ComboLegs field as a JSON array. QuantRocket assigns a sid for the combo consisting of a prefix ‘IC’ followed by an autoincrementing digit, for example: IC1, IC2, IC3, …

If the combo already exists, its sid will be returned instead of creating a duplicate record.
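The legs file can also be generated programmatically. Below is a minimal sketch using Python's json module; the sids shown are placeholders, not real contracts. Note that each leg is an array of action, ratio, and sid, and sids must be JSON strings:

```python
import json

# Each leg is [action, ratio, sid]; sids must be quoted JSON strings.
legs = [
    ["BUY", 1, "QF12345"],   # placeholder sid for the near leg
    ["SELL", 1, "QF23456"],  # placeholder sid for the far leg
]

# Write the legs to a file suitable for `quantrocket master create-ibkr-combo`
with open("spread.json", "w") as f:
    json.dump(legs, f)

# Verify the file round-trips as valid JSON
with open("spread.json") as f:
    print(json.load(f))
```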

Notes

Usage Guide:

Examples

Create a spread from a JSON file:

cat spread.json
[["BUY", 1, "QF12345"],
 ["SELL", 1, "QF23456"]]

quantrocket master create-ibkr-combo spread.json

rollrules

upload a new rollover rules config, or return the current rollover rules

quantrocket master rollrules [-h] [FILENAME]

Positional Arguments

FILENAME

the rollover rules YAML config file to upload (if omitted, return the current config)

Upload a new rollover rules config, or return the current rollover rules.

Examples

Upload a new rollover config (replaces current config):

quantrocket master rollrules myrolloverrules.yml

Show current rollover config:

quantrocket master rollrules

collect-ibkr-calendar

collect upcoming trading hours from IBKR for exchanges and save to securities master database

quantrocket master collect-ibkr-calendar [-h] [-e [EXCHANGE ...]]

Named Arguments

-e, --exchanges

limit to these exchanges

Collect upcoming trading hours from IBKR for exchanges and save to securities master database.

Notes

Usage Guide:

Examples

Collect trading hours for ARCA:

quantrocket master collect-ibkr-calendar -e ARCA

calendar

check whether exchanges are open or closed

quantrocket master calendar [-h] [-t SEC_TYPE] [-i TIMEDELTA | -a TIMEDELTA]
                            [-o]
                            EXCHANGE [EXCHANGE ...]

Positional Arguments

EXCHANGE

the exchange(s) to check

Named Arguments

-t, --sec-type

Possible choices: STK, FUT, CASH, OPT

the security type, if needed to disambiguate for exchanges that trade multiple security types. Possible choices: [‘STK’, ‘FUT’, ‘CASH’, ‘OPT’]

-i, --in

check whether exchanges will be open or closed at this point in the future (use Pandas timedelta string, e.g. 2h or 30min or 1d)

-a, --ago

check whether exchanges were open or closed this long ago (use Pandas timedelta string, e.g. 2h or 30min or 1d)

-o, --outside-rth

check extended hours calendar (default is to check regular trading hours calendar)

Default: False

Check whether exchanges are open or closed.

Notes

Usage Guide:

Examples

Check whether NYSE is open or closed now:

quantrocket master calendar NYSE

Check whether the Tokyo Stock Exchange was open or closed 5 hours ago:

quantrocket master calendar TSEJ --ago 5h

Check whether CME will be open or closed in 30 minutes:

quantrocket master calendar CME --in 30min

isopen

assert that one or more exchanges are open and exit non-zero if closed

quantrocket master isopen [-h] [-t SEC_TYPE] [-i TIMEDELTA | -a TIMEDELTA]
                          [-s FREQ | -u FREQ] [-o]
                          EXCHANGE [EXCHANGE ...]

Positional Arguments

EXCHANGE

the exchange(s) to check

Named Arguments

-t, --sec-type

Possible choices: STK, FUT, CASH, OPT

the security type, if needed to disambiguate for exchanges that trade multiple security types. Possible choices: [‘STK’, ‘FUT’, ‘CASH’, ‘OPT’]

-i, --in

assert that exchanges will be open at this point in the future (use Pandas timedelta string, e.g. 2h or 30min or 1d)

-a, --ago

assert that exchanges were open this long ago (use Pandas timedelta string, e.g. 2h or 30min or 1d)

-s, --since

assert that exchanges have been open (as of --in or --ago if applicable) since at least this time (use Pandas frequency string, e.g. ‘W’ (week end), ‘M’ (month end), ‘Q’ (quarter end), ‘A’ (year end))

-u, --until

assert that exchanges will be open (as of --in or --ago if applicable) until at least this time (use Pandas frequency string, e.g. ‘W’ (week end), ‘M’ (month end), ‘Q’ (quarter end), ‘A’ (year end))

-o, --outside-rth

check extended hours calendar (default is to check regular trading hours calendar)

Default: False

Assert that one or more exchanges are open and exit non-zero if closed.

Intended to be used as a conditional for running other commands.

Notes

Usage Guide:

Examples

Place Moonshot orders if NYSE is open now:

quantrocket master isopen NYSE && quantrocket moonshot orders my-strategy | quantrocket blotter order -f -

Collect historical data for Australian stocks if the exchange was open 4 hours ago:

quantrocket master isopen ASX --ago 4h && quantrocket history collect asx-stk-1d

Log a message if the London Stock Exchange will be open in 30 minutes:

quantrocket master isopen LSE --in 30min && quantrocket flightlog log 'the market opens soon!'

isclosed

assert that one or more exchanges are closed and exit non-zero if open

quantrocket master isclosed [-h] [-t SEC_TYPE] [-i TIMEDELTA | -a TIMEDELTA]
                            [-s FREQ | -u FREQ] [-o]
                            EXCHANGE [EXCHANGE ...]

Positional Arguments

EXCHANGE

the exchange(s) to check

Named Arguments

-t, --sec-type

Possible choices: STK, FUT, CASH, OPT

the security type, if needed to disambiguate for exchanges that trade multiple security types. Possible choices: [‘STK’, ‘FUT’, ‘CASH’, ‘OPT’]

-i, --in

assert that exchanges will be closed at this point in the future (use Pandas timedelta string, e.g. 2h or 30min or 1d)

-a, --ago

assert that exchanges were closed this long ago (use Pandas timedelta string, e.g. 2h or 30min or 1d)

-s, --since

assert that exchanges have been closed (as of --in or --ago if applicable) since at least this time (use Pandas frequency string, e.g. ‘W’ (week end), ‘M’ (month end), ‘Q’ (quarter end), ‘A’ (year end))

-u, --until

assert that exchanges will be closed (as of --in or --ago if applicable) until at least this time (use Pandas frequency string, e.g. ‘W’ (week end), ‘M’ (month end), ‘Q’ (quarter end), ‘A’ (year end))

-o, --outside-rth

check extended hours calendar (default is to check regular trading hours calendar)

Default: False

Assert that one or more exchanges are closed and exit non-zero if open.

Intended to be used as a conditional for running other commands.

For the --since/--until options, pass a Pandas frequency string, i.e. any string that is a valid freq argument to pd.date_range. See:
https://pandas.pydata.org/pandas-docs/stable/user_guide/timeseries.html#offset-aliases
https://pandas.pydata.org/pandas-docs/stable/user_guide/timeseries.html#anchored-offsets

Notes

Usage Guide:

Examples

Place Moonshot orders if the NYSE will be closed in 1 hour:

quantrocket master isclosed NYSE --in 1h && quantrocket moonshot orders my-strategy | quantrocket blotter order -f -

Collect historical data for Australian stocks if the exchange is closed now but was open 4 hours ago:

quantrocket master isclosed ASX && quantrocket master isopen ASX --ago 4h && quantrocket history collect asx-stk-1d

Place Moonshot orders if the NYSE has been closed since month end:

quantrocket master isclosed NYSE --since M && quantrocket moonshot orders monthly-rebalancing-strategy | quantrocket blotter order -f -

Place Moonshot orders if the NYSE will be closed in 1 hour and remain closed through quarter end:

quantrocket master isclosed NYSE --in 1H --until Q && quantrocket moonshot orders end-of-quarter-strategy | quantrocket blotter order -f -

ticksize

round prices in a CSV to valid tick sizes

quantrocket master ticksize [-h] -f INFILE -r FIELD [FIELD ...] [-d DIRECTION]
                            [-a] [-o OUTFILE]

Named Arguments

-f, --infile

CSV file with prices to be rounded (specify ‘-’ to read file from stdin)

-r, --round

columns to be rounded

-d, --how

Possible choices: up, down, nearest

which direction to round to. Possible choices: up, down, nearest (default is ‘nearest’)

-a, --append-ticksize

append a column of tick sizes for each field to be rounded

Default: False

-o, --outfile

filename to write the data to (default is stdout)

Round prices in a CSV file to valid tick sizes.

CSV should contain columns Sid, Exchange, and the columns to be rounded (e.g. LmtPrice). Additional columns will be ignored and returned unchanged.
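As a sketch of what the rounding does, the snippet below rounds a price to a multiple of a hypothetical 0.05 tick size. This is illustrative only; in practice the command looks up each security's valid tick sizes in the securities master database, and tick sizes can vary by price level:

```python
from decimal import Decimal

def round_to_tick(price, tick, how="nearest"):
    """Round a price to a multiple of the tick size (illustrative sketch)."""
    # Use Decimal to avoid binary floating-point artifacts
    price, tick = Decimal(str(price)), Decimal(str(tick))
    ticks = price / tick
    if how == "up":
        ticks = ticks.to_integral_value(rounding="ROUND_CEILING")
    elif how == "down":
        ticks = ticks.to_integral_value(rounding="ROUND_FLOOR")
    else:  # nearest
        ticks = ticks.to_integral_value(rounding="ROUND_HALF_UP")
    return float(ticks * tick)

print(round_to_tick(10.023, 0.05))           # 10.0
print(round_to_tick(10.023, 0.05, "up"))     # 10.05
print(round_to_tick(10.023, 0.05, "down"))   # 10.0
```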

Notes

Usage Guide:

Examples

Round the LmtPrice column in a CSV of orders and return a new CSV:

quantrocket master ticksize -f orders.csv --round LmtPrice -o rounded_orders.csv

Round the StopPrice column in a CSV of orders and append the tick size as a new column (called StopPriceTickSize):

quantrocket master ticksize -f orders.csv -r StopPrice --append-ticksize -o rounded_orders.csv

Round the LmtPrice column in a CSV of Moonshot orders then place the orders:

quantrocket moonshot orders umd-japan | quantrocket master ticksize -f - -r LmtPrice | quantrocket blotter order -f -
quantrocket.master.list_ibkr_exchanges(regions=None, sec_types=None)

List exchanges by security type and country as found on the IBKR website.

Parameters:
  • regions (list of str, optional) – limit to these regions. Possible choices: north_america, europe, asia, global

  • sec_types (list of str, optional) – limit to these security types. Possible choices: STK, ETF, FUT, CASH, IND

Return type:

dict

Notes

Usage Guide:

quantrocket.master.collect_alpaca_listings()

Collect securities listings from Alpaca and store in securities master database.

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.master.collect_edi_listings(exchanges=None)

Collect securities listings from EDI and store in securities master database.

Parameters:

exchanges (list or str, required) – collect listings for these exchanges (identified by MICs)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Collect sample listings:

>>> collect_edi_listings(exchanges="FREE")

Collect listings for all permitted exchanges:

>>> collect_edi_listings()

Collect all Chinese stock listings:

>>> collect_edi_listings(exchanges=["XSHG", "XSHE"])
quantrocket.master.collect_figi_listings()

Collect securities listings from Bloomberg OpenFIGI and store in securities master database.

OpenFIGI provides several useful security attributes including market sector, a detailed security type, and share class-level FIGI identifier.

The collected data fields show up in the master file with the prefix “figi_”.

This function does not directly query the OpenFIGI API but rather downloads a dump of all FIGIs which QuantRocket has previously mapped to securities from other vendors.

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Collect all available FIGI listings:

>>> collect_figi_listings()
quantrocket.master.collect_ibkr_listings(exchanges=None, sec_types=None, currencies=None, symbols=None, universes=None, sids=None)

Collect securities listings from Interactive Brokers and store in securities master database.

Specify an exchange (optionally filtering by security type, currency, and/or symbol) to collect listings from the IBKR website and collect associated contract details from the IBKR API. Or, specify universes or sids to collect details from the IBKR API, bypassing the website.

Parameters:
  • exchanges (list or str) – one or more IBKR exchange codes to collect listings for (required unless providing universes or sids). For sample data use exchange code ‘FREE’

  • sec_types (list of str, optional) – limit to these security types. Possible choices: STK, ETF, FUT, CASH, IND

  • currencies (list of str, optional) – limit to these currencies

  • symbols (list of str, optional) – limit to these symbols

  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Collect free sample listings:

>>> collect_ibkr_listings(exchanges="FREE")

Collect all Toronto Stock Exchange stock listings:

>>> collect_ibkr_listings(exchanges="TSE", sec_types="STK")

Collect all NYSE ARCA ETF listings:

>>> collect_ibkr_listings(exchanges="ARCA", sec_types="ETF")

Collect specific symbols from Nasdaq:

>>> collect_ibkr_listings(exchanges="NASDAQ", symbols=["AAPL", "GOOG", "NFLX"])

Re-collect contract details for an existing universe called “japan-fin”:

>>> collect_ibkr_listings(universes="japan-fin")
quantrocket.master.collect_sharadar_listings(countries='US')

Collect securities listings from Sharadar and store in securities master database.

Parameters:

countries (str, required) – country to collect listings for. Possible choices: US, FREE

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.master.collect_usstock_listings()

Collect US stock listings from QuantRocket and store in securities master database.

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.master.collect_ibkr_option_chains(universes=None, sids=None, infilepath_or_buffer=None)

Collect IBKR option chains for underlying securities.

Note: option chains often consist of hundreds, sometimes thousands of options per underlying security. Be aware that requesting option chains for large universes of underlying securities, such as all stocks on the NYSE, can take many hours to complete.

Parameters:
  • universes (list of str, optional) – collect options for these universes of underlying securities

  • sids (list of str, optional) – collect options for these underlying sids

  • infilepath_or_buffer (str or file-like object, optional) – collect options for the sids in this file (specify ‘-’ to read file from stdin)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.master.diff_ibkr_securities(universes=None, sids=None, infilepath_or_buffer=None, fields=None, delist_missing=False, delist_exchanges=None, wait=False)

Flag security details that have changed in IBKR’s system since the time they were last collected into the securities master database.

Diff can be run synchronously or asynchronously (asynchronous is the default and is recommended if diffing more than a handful of securities).

Parameters:
  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

  • infilepath_or_buffer (str or file-like object, optional) – limit to the sids in this file (specify ‘-’ to read file from stdin)

  • fields (list of str, optional) – only diff these fields (field name should start with “ibkr”)

  • delist_missing (bool) – auto-delist securities that are no longer available from IBKR

  • delist_exchanges (list of str, optional) – auto-delist securities that are associated with these exchanges

  • wait (bool) – run the diff synchronously and return the diff (otherwise run asynchronously and log the results, if any, to flightlog)

Returns:

dict of sids and fields that have changed (if wait), or status message

Return type:

dict

Notes

Usage Guide:

quantrocket.master.download_master_file(filepath_or_buffer=None, output='csv', exchanges=None, sec_types=None, currencies=None, universes=None, symbols=None, sids=None, exclude_universes=None, exclude_sids=None, exclude_delisted=False, exclude_expired=False, frontmonth=False, vendors=None, fields=None)

Query security details from the securities master database and download to file.

Parameters:
  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (csv or json, default is csv)

  • exchanges (list of str, optional) – limit to these exchanges. You can specify exchanges using the MIC or the vendor’s exchange code.

  • sec_types (list of str, optional) – limit to these security types. Possible choices: STK, ETF, FUT, CASH, IND, OPT, FOP, BAG

  • currencies (list of str, optional) – limit to these currencies

  • universes (list of str, optional) – limit to these universes

  • symbols (list of str, optional) – limit to these symbols

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

  • exclude_delisted (bool) – exclude delisted securities (default is to include them)

  • exclude_expired (bool) – exclude expired contracts (default is to include them)

  • frontmonth (bool) – exclude backmonth and expired futures contracts (default False)

  • vendors (list of str, optional) – limit to these vendors. Possible choices: alpaca, edi, ibkr, sharadar, usstock

  • fields (list of str, optional) – Return specific fields. By default a core set of fields is returned, but additional vendor-specific fields are also available. To return non-core fields, you can reference them by name, or pass “*” to return all available fields. To return all fields for a specific vendor, pass the vendor prefix followed by “*”, for example “edi*” for all EDI fields. Pass “?*” (or any invalid vendor prefix plus “*”) to see available vendor prefixes. Pass “?” or any invalid fieldname to see all available fields.

Return type:

None

See also

get_securities

load securities into a DataFrame

Notes

Parameters for filtering query results are combined according to the following rules. First, the master service determines what to include in the result set, based on the inclusion filters: exchanges, sec_types, currencies, universes, symbols, and sids. With the exception of sids, these parameters are ANDed together. That is, securities must satisfy all of the parameters to be included. If vendors is provided, only those vendors are searched for the purpose of determining matches.

The sids parameter is treated differently. Securities matching sids are always included, regardless of whether they meet the other inclusion criteria.

After determining what to include, the master service then applies the exclusion filters (exclude_sids, exclude_universes, exclude_delisted, exclude_expired, and frontmonth) to determine what (if anything) should be removed from the result set. Exclusion filters are ORed, that is, securities are excluded if they match any of the exclusion criteria.

Usage Guide:

Examples

Download NYSE and NASDAQ securities to file, using MICs to specify the exchanges:

>>> download_master_file("securities.csv", exchanges=["XNYS","XNAS"])

Download NYSE and NASDAQ securities to file, using IBKR exchange codes to specify the exchanges, and include all IBKR fields:

>>> download_master_file("securities.csv", exchanges=["NYSE","NASDAQ"], fields="ibkr*")

Download securities for a particular universe to in-memory file, including all possible fields, and load the CSV into pandas.

>>> f = io.StringIO()
>>> download_master_file(f, fields="*", universes="my-universe")
>>> securities = pd.read_csv(f)
quantrocket.master.get_securities(symbols=None, exchanges=None, sec_types=None, currencies=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None, exclude_delisted=False, exclude_expired=False, frontmonth=False, vendors=None, fields=None)

Return a DataFrame of security details from the securities master database.

Parameters:
  • symbols (list of str, optional) – limit to these symbols

  • exchanges (list of str, optional) – limit to these exchanges. You can specify exchanges using the MIC or the vendor’s exchange code.

  • sec_types (list of str, optional) – limit to these security types. Possible choices: STK, ETF, FUT, CASH, IND, OPT, FOP, BAG

  • currencies (list of str, optional) – limit to these currencies

  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

  • exclude_delisted (bool) – exclude delisted securities (default is to include them)

  • exclude_expired (bool) – exclude expired contracts (default is to include them)

  • frontmonth (bool) – exclude backmonth and expired futures contracts (default False)

  • vendors (list of str, optional) – limit to these vendors. Possible choices: alpaca, edi, ibkr, sharadar, usstock

  • fields (list of str, optional) – Return specific fields. By default a core set of fields is returned, but additional vendor-specific fields are also available. To return non-core fields, you can reference them by name, or pass “*” to return all available fields. To return all fields for a specific vendor, pass the vendor prefix followed by “*”, for example “edi*” for all EDI fields. Pass “?*” (or any invalid vendor prefix plus “*”) to see available vendor prefixes. Pass “?” or any invalid fieldname to see all available fields.

Returns:

a DataFrame of securities, with Sids as the index

Return type:

DataFrame

Notes

Parameters for filtering query results are combined according to the following rules. First, the master service determines what to include in the result set, based on the inclusion filters: exchanges, sec_types, currencies, universes, symbols, and sids. With the exception of sids, these parameters are ANDed together. That is, securities must satisfy all of the parameters to be included. If vendors is provided, only those vendors are searched for the purpose of determining matches.

The sids parameter is treated differently. Securities matching sids are always included, regardless of whether they meet the other inclusion criteria.

After determining what to include, the master service then applies the exclusion filters (exclude_sids, exclude_universes, exclude_delisted, exclude_expired, and frontmonth) to determine what (if anything) should be removed from the result set. Exclusion filters are ORed, that is, securities are excluded if they match any of the exclusion criteria.
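The inclusion and exclusion semantics described above can be sketched in plain Python. This is an illustrative model only, not the actual master service implementation; the security record is hypothetical and only a subset of the filters is shown:

```python
def matches(sec, exchanges=None, sec_types=None, sids=None,
            exclude_sids=None, exclude_delisted=False):
    """Illustrative model of how filters combine (not the real implementation)."""
    # Inclusion filters other than sids are ANDed together
    included = all([
        exchanges is None or sec["Exchange"] in exchanges,
        sec_types is None or sec["SecType"] in sec_types,
    ])
    # Securities matching sids are always included, regardless of other filters
    if sids is not None and sec["Sid"] in sids:
        included = True
    if not included:
        return False
    # Exclusion filters are ORed: matching any one removes the security
    if exclude_sids is not None and sec["Sid"] in exclude_sids:
        return False
    if exclude_delisted and sec["Delisted"]:
        return False
    return True

sec = {"Sid": "FIBBG000B9XRY4", "Exchange": "XNAS", "SecType": "STK", "Delisted": False}
print(matches(sec, exchanges=["XNYS"]))                            # False: fails inclusion
print(matches(sec, exchanges=["XNYS"], sids=["FIBBG000B9XRY4"]))   # True: sids always included
```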

Usage Guide:

Examples

Load default fields for NYSE and NASDAQ securities, using MICs to specify the exchanges:

>>> securities = get_securities(exchanges=["XNYS","XNAS"])

Load sids for MSFT and AAPL:

>>> sids = get_securities(symbols=["MSFT", "AAPL"]).index.tolist()

Load NYSE and NASDAQ securities, using IBKR exchange codes to specify the exchanges, and include all IBKR fields:

>>> securities = get_securities(exchanges=["NYSE","NASDAQ"], fields="ibkr*")
quantrocket.master.get_securities_reindexed_like(reindex_like, fields=None)

Return a multiindex DataFrame of securities master data, reindexed to match the index and columns (sids) of reindex_like.

Parameters:
  • reindex_like (DataFrame, required) – a DataFrame (usually of prices) with dates for the index and sids for the columns, to which the shape of the resulting DataFrame will be conformed

  • fields (list of str) – a list of fields to include in the resulting DataFrame. By default a core set of fields is returned, but additional vendor-specific fields are also available. To return non-core fields, you can reference them by name, or pass “*” to return all available fields. To return all fields for a specific vendor, pass the vendor prefix followed by “*”, for example “edi*” for all EDI fields. Pass “?*” (or any invalid vendor prefix plus “*”) to see available vendor prefixes. Pass “?” or any invalid fieldname to see all available fields.

Returns:

a multiindex (Field, Date) DataFrame of securities master data, shaped like the input DataFrame

Return type:

DataFrame

Notes

Usage Guide:

Examples

Get exchanges (MICs) using a DataFrame of prices:

>>> closes = prices.loc["Close"]
>>> securities = get_securities_reindexed_like(
        closes, fields=["Exchange"])
>>> exchanges = securities.loc["Exchange"]
>>> nyse_closes = closes.where(exchanges == "XNYS")
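
The alignment that makes this masking work can be illustrated with toy data shaped like the function's inputs and outputs; the values below are stand-ins for an actual prices query and the (Field, Date) DataFrame that get_securities_reindexed_like would return:

```python
import pandas as pd

# Toy stand-in for a DataFrame of closing prices (dates x sids)
dates = pd.date_range("2024-01-02", periods=2)
closes = pd.DataFrame(
    [[10.0, 20.0], [11.0, 21.0]], index=dates,
    columns=["FI111111", "FI222222"])

# Toy stand-in for the multiindex (Field, Date) result: one Exchange
# value per sid, repeated down the date index
exchanges = pd.DataFrame(
    [["XNYS", "XNAS"]] * 2, index=dates, columns=closes.columns)
securities = pd.concat({"Exchange": exchanges}, names=["Field", "Date"])

# Same pattern as the example above: select a field, then mask prices
nyse_closes = closes.where(securities.loc["Exchange"] == "XNYS")
print(nyse_closes["FI222222"].isnull().all())  # the non-NYSE sid is masked
```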
quantrocket.master.get_contract_nums_reindexed_like(reindex_like, limit=5)

From a DataFrame of futures (with dates as the index and sids as columns), return a DataFrame of integers representing each sid’s sequence in the futures chain as of each date, where 1 is the front contract, 2 is the second nearest contract, etc.

Sequences are based on the RolloverDate field in the securities master file, which is based on configurable rollover rules.

Parameters:
  • reindex_like (DataFrame, required) – a DataFrame (usually of prices) with dates for the index and sids for the columns, to which the shape of the resulting DataFrame will be conformed

  • limit (int) – how many contracts ahead to sequence. For example, assuming quarterly contracts, a limit of 5 will sequence 5 quarters out. Default 5.

Returns:

a DataFrame of futures chain sequence numbers, shaped like the input DataFrame

Return type:

DataFrame

Notes

Usage Guide:

Examples

Get a Boolean mask of front-month contracts:

>>> closes = prices.loc["Close"]
>>> contract_nums = get_contract_nums_reindexed_like(closes)
>>> are_front_months = contract_nums == 1
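
Building on that mask, a common next step is stitching a continuous front-month price series. A sketch with toy data standing in for real prices and for the DataFrame that get_contract_nums_reindexed_like would return:

```python
import pandas as pd

dates = pd.date_range("2024-01-02", periods=3)
sids = ["FI111111", "FI222222"]
closes = pd.DataFrame(
    [[100.0, 101.0], [102.0, 103.0], [104.0, 105.0]],
    index=dates, columns=sids)

# Toy stand-in for contract sequence numbers: the front contract (1)
# rolls from the first sid to the second on the last date
contract_nums = pd.DataFrame(
    [[1, 2], [1, 2], [2, 1]], index=dates, columns=sids)

# Keep only each day's front contract, then collapse to a single series
front_closes = closes.where(contract_nums == 1).mean(axis=1)
print(front_closes.tolist())
# → [100.0, 102.0, 105.0]
```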
quantrocket.master.create_universe(code, infilepath_or_buffer=None, sids=None, from_universes=None, exclude_delisted=False, append=False, replace=False)

Create a universe of securities.

Parameters:
  • code (str, required) – the code to assign to the universe (lowercase alphanumerics and hyphens only)

  • infilepath_or_buffer (str or file-like object, optional) – create the universe from the sids in this file (specify ‘-’ to read file from stdin)

  • sids (list of str, optional) – create the universe from these sids

  • from_universes (list of str, optional) – create the universe from these existing universes

  • exclude_delisted (bool) – exclude delisted securities and expired contracts that would otherwise be included (default is not to exclude them)

  • append (bool) – append to universe if universe already exists (default False)

  • replace (bool) – replace universe if universe already exists (default False)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Create a universe called ‘nyse-stk’ from a CSV file:

>>> create_universe("nyse-stk", "nyse_securities.csv")

Create a universe from a DataFrame of securities:

>>> securities = get_securities(exchanges="TSEJ")
>>> create_universe("japan-stk", sids=securities.index.tolist())
quantrocket.master.delete_universe(code)

Delete a universe.

The listing details of the member securities won’t be deleted, only their grouping as a universe.

Parameters:

code (str, required) – the universe code

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.master.list_universes()

List universes and their size.

Returns:

dict of universe:size

Return type:

dict

Notes

Usage Guide:

quantrocket.master.delist_ibkr_security(sid=None, symbol=None, exchange=None, currency=None, sec_type=None)

Mark an IBKR security as delisted.

This does not remove any data but simply marks the security as delisted so that data services won’t attempt to collect data for the security and so that the security can be optionally excluded from query results.

The security can be specified by sid or a combination of other parameters (for example, symbol + exchange). As a precaution, the request will fail if the parameters match more than one security.

Parameters:
  • sid (str, optional) – the sid of the security to be delisted

  • symbol (str, optional) – the symbol to be delisted (if sid not provided)

  • exchange (str, optional) – the exchange of the security to be delisted (if needed to disambiguate)

  • currency (str, optional) – the currency of the security to be delisted (if needed to disambiguate)

  • sec_type (str, optional) – the security type of the security to be delisted (if needed to disambiguate). Possible choices: STK, ETF, FUT, CASH, IND

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.master.create_ibkr_combo(combo_legs)

Create an IBKR combo (aka spread), which is a composite instrument consisting of two or more individual instruments (legs) that are traded as a single instrument.

Each user-defined combo is stored in the securities master database with a SecType of “BAG”. The combo legs are stored in the ComboLegs field as a JSON array. QuantRocket assigns a sid for the combo consisting of a prefix ‘IC’ followed by an autoincrementing number, for example: IC1, IC2, IC3, …

If the combo already exists, its sid will be returned instead of creating a duplicate record.

Parameters:

combo_legs (list, required) – a list of the combo legs, where each leg is a list specifying action, ratio, and sid

Returns:

returns a dict containing the generated sid of the combo, and whether a new record was created

Return type:

dict

Notes

Usage Guide:

Examples

To create a calendar spread on VX, first retrieve the sids of the legs:

>>> from quantrocket.master import download_master_file
>>> download_master_file("vx.csv", symbols="VIX", exchanges="CFE", sec_types="FUT")
>>> vx_sids = pd.read_csv("vx.csv", index_col="Symbol").Sid.to_dict()

Then create the combo:

>>> create_ibkr_combo([
        ["BUY", 1, vx_sids["VXV9"]],
        ["SELL", 1, vx_sids["VXQ9"]]
    ])
    {"sid": "IC1", "created": True}
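
Since the combo legs are stored in the ComboLegs field as a JSON array, they can be recovered from a master file query with the standard library’s json module. The stored value below is illustrative:

```python
import json

# Illustrative ComboLegs value as it might appear in the securities
# master: a JSON array of [action, ratio, sid] legs
combo_legs_field = '[["BUY", 1, "FI123456"], ["SELL", 1, "FI234567"]]'

legs = json.loads(combo_legs_field)
for action, ratio, sid in legs:
    print(action, ratio, sid)
# → BUY 1 FI123456
# → SELL 1 FI234567
```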
quantrocket.master.collect_ibkr_calendar(exchanges=None)

Collect upcoming trading hours from IBKR for exchanges and save to securities master database.

Parameters:

exchanges (list of str, optional) – limit to these exchanges

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.master.list_calendar_statuses(exchanges, sec_type=None, in_=None, ago=None, outside_rth=False)

Check whether exchanges are open or closed.

Parameters:
  • exchanges (list of str, required) – the exchange(s) to check

  • sec_type (str, optional) – the security type, if needed to disambiguate for exchanges that trade multiple security types. Possible choices: STK, FUT, CASH, OPT

  • in_ (str, optional) – check whether exchanges will be open or closed at this point in the future (use Pandas timedelta string, e.g. 2h or 30min or 1d)

  • ago (str, optional) – check whether exchanges were open or closed this long ago (use Pandas timedelta string, e.g. 2h or 30min or 1d)

  • outside_rth (bool) – check extended hours calendar (default is to check regular trading hours calendar)

Returns:

exchange calendar status

Return type:

dict

Notes

Usage Guide:
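
Examples

The returned dict maps each exchange to its calendar status. A sketch of acting on the result, using a hard-coded dict of the documented shape in place of an actual query:

```python
# Hard-coded example of what list_calendar_statuses(["ARCA"]) returns;
# the real call requires collected IBKR calendar data
statuses = {
    "ARCA": {
        "status": "open",
        "since": "2018-05-10T09:30:00",
        "until": "2018-05-10T16:00:00",
        "timezone": "America/New_York",
    }
}

# Trade only on exchanges that are currently open
tradable = [exchange for exchange, calendar in statuses.items()
            if calendar["status"] == "open"]
print(tradable)
# → ['ARCA']
```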

quantrocket.master.round_to_tick_sizes(infilepath_or_buffer, round_fields, how=None, append_ticksize=False, outfilepath_or_buffer=None)

Round prices in a CSV file to valid tick sizes.

CSV should contain columns Sid, Exchange, and the columns to be rounded (e.g. LmtPrice). Additional columns will be ignored and returned unchanged.

Parameters:
  • infilepath_or_buffer (str or file-like object, required) – CSV file with prices to be rounded (specify ‘-’ to read file from stdin)

  • round_fields (list of str, required) – columns to be rounded

  • how (str, optional) – which direction to round to. Possible choices: ‘up’, ‘down’, ‘nearest’ (default is ‘nearest’)

  • append_ticksize (bool) – append a column of tick sizes for each field to be rounded (default False)

  • outfilepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

Return type:

None

Notes

Usage Guide:
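
Examples

The rounding the service performs can be sketched for a single price: snap the price to a multiple of the security’s tick size. The tick size below is a hypothetical stand-in (the service looks up each security’s actual tick size by Sid and Exchange); a ¥5 tick mirrors the TSEJ row in the HTTP tick sizes example, where a limit price near ¥15,203 rounds in ¥5 increments:

```python
import math

def round_to_tick(price, tick_size, how="nearest"):
    """Illustrative tick rounding: snap price to a multiple of tick_size."""
    ticks = price / tick_size
    if how == "up":
        ticks = math.ceil(ticks)
    elif how == "down":
        ticks = math.floor(ticks)
    else:  # 'nearest' (the default)
        ticks = round(ticks)
    return ticks * tick_size

print(round_to_tick(15203.1135, 5))          # → 15205
print(round_to_tick(15203.1135, 5, "down"))  # → 15200
```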

 

Securities Master API

Resource Group

Exchanges

List Exchanges
GET/master/exchanges/{vendor}{?regions,sec_types}

List exchanges.

Example URI

GET http://houston/master/exchanges/ibkr?regions=asia&sec_types=STK
URI Parameters
vendor
str (required) Example: ibkr

the vendor to list exchanges for

Choices: ibkr

regions
str (optional) Example: asia

limit to these regions (pass multiple times for multiple regions)

Choices: north_america europe asia global

sec_types
str (optional) Example: STK

limit to these security types (pass multiple times for multiple security types)

Choices: STK ETF FUT CASH IND

Response  200
Headers
Content-Type: application/json
Body
{
  "STK": {
    "Australia": [
      "ASX",
      "CHIXAU"
    ],
    "Hong Kong": [
      "SEHK",
      "SEHKNTL",
      "SEHKSZSE"
    ],
    "India": [
      "NSE"
    ],
    "Japan": [
      "CHIXJ",
      "JPNNEXT",
      "TSEJ"
    ],
    "Singapore": [
      "SGX"
    ]
  }
}

Securities

Collect Listings
POST/master/securities/{vendor}{?exchanges,countries,sec_types,currencies,symbols,universes,sids}

Collect securities listings from a vendor and store in securities master database.

Not all parameters are applicable to all vendors. Please see the Python API reference to determine which parameters are applicable to which vendors.

Example URI

POST http://houston/master/securities/usstock?exchanges=XNYS&countries=US&sec_types=STK&currencies=USD&symbols=LVS&universes=my-universe&sids=FI123456
URI Parameters
vendor
str (required) Example: usstock

the vendor to collect data from

Choices: alpaca edi figi ibkr sharadar usstock

exchanges
str (optional) Example: XNYS

the exchange code to collect listings for (required unless providing universes or sids) (pass multiple times for multiple exchanges)

sec_types
str (optional) Example: STK

limit to these security types (pass multiple times for multiple security types)

Choices: STK ETF FUT CASH IND

currencies
str (optional) Example: USD

limit to these currencies (pass multiple times for multiple currencies)

symbols
str (optional) Example: LVS

limit to these symbols (pass multiple times for multiple symbols)

universes
str (optional) Example: my-universe

limit to these universes (pass multiple times for multiple universes)

sids
str (optional) Example: FI123456

limit to these sids (pass multiple times for multiple sids)

countries
str (optional) Example: US

countries to collect listings for

Choices: US FREE

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the listing details will be collected asynchronously"
}

Download Master File
GET/master/securities/{output}{?exchanges,sec_types,currencies,symbols,universes,sids,exclude_universes,exclude_sids,exclude_delisted,exclude_expired,vendors,fields,frontmonth}

Query security details from the securities master database and download to file.

Parameters for filtering query results are combined according to the following rules. First, the master service determines what to include in the result set, based on the inclusion filters: exchanges, sec_types, currencies, universes, symbols, and sids. With the exception of sids, these parameters are ANDed together. That is, securities must satisfy all of the parameters to be included. If vendors is provided, only those vendors are searched for the purpose of determining matches.

The sids parameter is treated differently. Securities matching sids are always included, regardless of whether they meet the other inclusion criteria.

After determining what to include, the master service then applies the exclusion filters (exclude_sids, exclude_universes, exclude_delisted, exclude_expired, and frontmonth) to determine what (if anything) should be removed from the result set. Exclusion filters are ORed, that is, securities are excluded if they match any of the exclusion criteria.

Example URI

GET http://houston/master/securities/.csv?exchanges=NYSE&sec_types=STK&currencies=USD&symbols=LVS&universes=my-universe&sids=FI123456&exclude_universes=another-universe&exclude_sids=234567&exclude_delisted=false&exclude_expired=false&vendors=usstock&fields=*&frontmonth=false
URI Parameters
output
str (required) Example: .csv

output format

Choices: .csv .json

exchanges
str (optional) Example: NYSE

limit to these exchanges. You can specify exchanges using the MIC or the vendor’s exchange code. (pass multiple times for multiple exchanges)

sec_types
str (optional) Example: STK

limit to these security types (pass multiple times for multiple security types)

Choices: STK ETF FUT CASH IND OPT FOP BAG

currencies
str (optional) Example: USD

limit to these currencies (pass multiple times for multiple currencies)

symbols
str (optional) Example: LVS

limit to these symbols (pass multiple times for multiple symbols)

universes
str (optional) Example: my-universe

limit to these universes (pass multiple times for multiple universes)

sids
str (optional) Example: FI123456

limit to these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: another-universe

exclude these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: 234567

exclude these sids (pass multiple times for multiple sids)

exclude_delisted
bool (required) Example: false

exclude delisted securities (default is to include them)

exclude_expired
bool (required) Example: false

exclude expired contracts (default is to include them)

frontmonth
bool (required) Example: false

exclude backmonth and expired futures contracts (default False)

vendors
str (optional) Example: usstock

limit to these vendors (pass multiple times for multiple vendors)

Choices: alpaca edi ibkr sharadar usstock

fields
str (optional) Example: *

Return specific fields. By default a core set of fields is returned, but additional vendor-specific fields are also available. To return non-core fields, you can reference them by name, or pass * to return all available fields. To return all fields for a specific vendor, pass the vendor prefix followed by * , for example edi* for all EDI fields. Pass ?* (or any invalid vendor prefix plus * ) to see available vendor prefixes. Pass “?” or any invalid fieldname to see all available fields.

Response  200
Headers
Content-Type: text/csv
Body
279016041,NSC,FUT,0,ONE,USD,NSC3MM7,NSC3M,NSC3M,"NORFOLK SOUTHERN CORP",EST,Industrial,Transportation,Transport-Rail,0.01,1,1,20170619,201706,100,,0
279016083,NUS,FUT,0,ONE,USD,NUS3MM7,NUS3M,NUS3M,"NU SKIN ENTERPRISES INC - A",EST,"Consumer, Cyclical",Retail,"Multilevel Dir Selling",0.01,1,1,20170619,201706,100,,0
279016106,O,FUT,0,ONE,USD,O3MM7,O3M,O3M,"REALTY INCOME CORP",EST,Financial,REITS,"REITS-Single Tenant",0.01,1,1,20170619,201706,100,,0

Delist Security
DELETE/master/securities/{vendor}{?exchange,sec_type,currency,symbols,sids}

Mark an IBKR security as delisted.

This does not remove any data but simply marks the security as delisted so that data services won’t attempt to collect data for the security and so that the security can be optionally excluded from query results.

The security can be specified by sid or a combination of other parameters (for example, symbol + exchange). As a precaution, the request will fail if the parameters match more than one security.

Example URI

DELETE http://houston/master/securities/ibkr?exchange=NYSE&sec_type=STK&currency=USD&symbols=ABC&sids=FI123456
URI Parameters
vendor
str (required) Example: ibkr

the vendor to delist the security for.

Choices: ibkr

sids
str (optional) Example: FI123456

the sid of the security to be delisted

symbols
str (optional) Example: ABC

the symbol to be delisted (if sid not provided)

exchange
str (optional) Example: NYSE

the exchange of the security to be delisted (if needed to disambiguate)

sec_type
str (optional) Example: STK

the security type of the security to be delisted (if needed to disambiguate)

Choices: STK ETF FUT CASH IND

currency
str (optional) Example: USD

the currency of the security to be delisted (if needed to disambiguate)

Response  200
Headers
Content-Type: application/json
Body
{
  "msg": "delisted sid FI123456"
}

Options

Collect Option Chains
POST/master/options/ibkr{?universes,sids}

Collect IBKR option chains for underlying securities. Specify sids or universes or upload a CSV of sids as the request body.

Example URI

POST http://houston/master/options/ibkr?universes=my-universe&sids=FI123456
URI Parameters
universes
str (optional) Example: my-universe

collect options for these universes of underlying securities (pass multiple times for multiple universes)

sids
str (optional) Example: FI123456

collect options for these underlying sids (pass multiple times for multiple sids)

Request
Headers
Content-Type: text/csv
Body
Sid,OtherField,OtherField2
FI123456,other,fields
FI234567,will,be
FI345678,ignored,
Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the option chains will be collected asynchronously"
}

Diff

Diff Securities
GET/master/diff/ibkr{?universes,sids,fields,delist_missing,delist_exchanges,wait}

Flag security details that have changed in IBKR’s system since the time they were last loaded into the securities master database. Specify sids or universes or upload a CSV of sids as the request body.

Can be run synchronously or asynchronously (asynchronous is the default and is recommended if diffing more than a handful of securities).

Example URI

GET http://houston/master/diff/ibkr?universes=my-universe&sids=FI123456&fields=ibkr_PrimaryExchange&delist_missing=true&delist_exchanges=PINK&wait=true
URI Parameters
universes
str (optional) Example: my-universe

limit to these universes (pass multiple times for multiple universes)

sids
str (optional) Example: FI123456

limit to these sids (pass multiple times for multiple sids)

fields
str (optional) Example: ibkr_PrimaryExchange

only diff these fields (pass multiple times for multiple fields)

delist_missing
bool (optional) Example: true

auto-delist securities that are no longer available from IBKR

delist_exchanges
str (optional) Example: PINK

auto-delist securities that are associated with these exchanges

wait
bool (optional) Example: true

run the diff synchronously and return the diff (otherwise run asynchronously and log the results, if any, to flightlog)

Request
Headers
Content-Type: text/csv
Body
Sid,OtherField,OtherField2
FI123456,other,fields
FI234567,will,be
FI345678,ignored,
Response  200
Headers
Content-Type: application/json
Body
{
  "security_diffs": [
    {
      "as_stored_in_db": {
        "Symbol": "ABC",
        "Currency": "USD",
        "SecType": "STK",
        "PrimaryExchange": "NYSE",
        "Sid": "FI123456"
      },
      "changes_in_ibkr": {
        "PrimaryExchange": {
          "new": "PINK",
          "old": "NYSE"
        }
      }
    }
  ]
}
Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the diff, if any, will be logged to flightlog asynchronously"
}

Universes

List Universes
GET/master/universes/

List universes and their size.

Example URI

GET http://houston/master/universes/
Response  200
Headers
Content-Type: application/json
Body
{
  "jpn-fut": 196,
  "fx": 98,
  "arca-etf": 1514,
  "asx-stk": 2313
}

Create Universe
PUT/master/universes/{code}{?from_universes,exclude_delisted,replace}

Create a universe of securities from existing universes and/or from a CSV of sids uploaded as the request body.

Example URI

PUT http://houston/master/universes/arca-etf?from_universes=some-universe&exclude_delisted=true&replace=true
URI Parameters
code
str (required) Example: arca-etf

the code to assign to the universe (lowercase alphanumerics and hyphens only)

from_universes
str (optional) Example: some-universe

create the universe from these existing universes (pass multiple times for multiple existing universes)

exclude_delisted
bool (required) Example: true

exclude delisted securities that would otherwise be included (default is not to exclude them)

replace
bool (required) Example: true

replace universe if universe already exists (default is to fail if universe already exists)

Request
Headers
Content-Type: text/csv
Body
Sid,OtherField,OtherField2
FI123456,other,fields
FI234567,will,be
FI345678,ignored,
Response  200
Headers
Content-Type: application/json
Body
{
  "code": "arca-etf",
  "provided": 1514,
  "inserted": 1514,
  "total_after_insert": 1514
}

Append to Universe
PATCH/master/universes/{code}{?from_universes,exclude_delisted}

Append to a universe of securities from existing universes and/or from a CSV of sids uploaded as the request body.

Example URI

PATCH http://houston/master/universes/arca-etf?from_universes=some-universe&exclude_delisted=true
URI Parameters
code
str (required) Example: arca-etf

the code of the universe to append to

from_universes
str (optional) Example: some-universe

append these existing universes to the universe (pass multiple times for multiple existing universes)

exclude_delisted
bool (required) Example: true

exclude delisted securities that would otherwise be included (default is not to exclude them)

Request
Headers
Content-Type: text/csv
Body
Sid,OtherField,OtherField2
FI123456,other,fields
FI234567,will,be
FI345678,ignored,
Response  200
Headers
Content-Type: application/json
Body
{
  "code": "arca-etf",
  "provided": 1534,
  "inserted": 30,
  "total_after_insert": 1534
}

Delete Universe
DELETE/master/universes/{code}

Delete a universe. (The listing details of the member securities won’t be deleted, only their grouping as a universe).

Example URI

DELETE http://houston/master/universes/arca-etf
URI Parameters
code
str (required) Example: arca-etf

the universe code

Response  200
Headers
Content-Type: application/json
Body
{
  "code": "arca-etf",
  "deleted": 1534
}

Combos

Create IBKR Combo
PUT/master/combos/ibkr

Create an IBKR combo (aka spread), which is a composite instrument consisting of two or more individual instruments (legs) that are traded as a single instrument.

Each user-defined combo is stored in the securities master database with a SecType of “BAG”. The combo legs are stored in the ComboLegs field as a JSON array. QuantRocket assigns a sid for the combo consisting of a prefix ‘IC’ followed by an autoincrementing number, for example: IC1, IC2, IC3, …

If the combo already exists, its sid will be returned instead of creating a duplicate record.

Example URI

PUT http://houston/master/combos/ibkr
Request
Headers
Content-Type: application/json
Body
[
  [
    "BUY",
    1,
    "FI12345"
  ],
  [
    "SELL",
    1,
    "FI23456"
  ]
]
Response  200
Headers
Content-Type: application/json
Body
{
    "sid": "IC1",
    "created": true
}

Rollover Config

Get Rollover Config
GET/master/config/rollover

Returns the current rollover rules config.

Example URI

GET http://houston/master/config/rollover
Response  200
Headers
Content-Type: application/json
Body
{
  "CME": {
    "ES": {
      "rollrule": {
        "days": -8
      },
      "same_for": [
        "NQ",
        "RS",
        "YM"
      ]
    },
    "HE": {
      "only_months": [
        2,
        4,
        6,
        8,
        10,
        12
      ],
      "rollrule": {
        "months": -1,
        "day": 27
      },
      "same_for": [
        "LE"
      ]
    }
  },
  "NYMEX": {
    "CL": {
      "rollrule": {
        "days": -1
      },
      "same_for": [
        "NG"
      ]
    }
  }
}

Load Rollover Config
PUT/master/config/rollover

Upload a new rollover rules config.

Example URI

PUT http://houston/master/config/rollover
Request
Headers
Content-Type: application/x-yaml
Body
CME:
    ES:
        rollrule:
            days: -8
        same_for:
        - NQ
        - RS
        - YM
    HE:
        only_months:
        - 2
        - 4
        - 6
        - 8
        - 10
        - 12
        rollrule:
            months: -1
            day: 27
        same_for:
        - LE
NYMEX:
    CL:
        rollrule:
            days: -1
        same_for:
        - NG
Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the config will be loaded asynchronously"
}

Calendars

Collect IBKR Calendars
POST/master/calendar/{vendor}{?exchanges}

Collect upcoming trading hours from IBKR for exchanges and save to securities master database.

Example URI

POST http://houston/master/calendar/ibkr?exchanges=ARCA
URI Parameters
vendor
str (required) Example: ibkr

the vendor to collect trading hours from

Choices: ibkr

exchanges
str (optional) Example: ARCA

limit to these exchanges. Pass multiple times for multiple exchanges.

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the trading hours will be collected asynchronously"
}

Get Exchange Status
GET/master/calendar/{?exchanges,sec_type,in,ago,outside_rth}

Check whether exchanges are open or closed.

Example URI

GET http://houston/master/calendar/?exchanges=ARCA&sec_type=STK&in=1h&ago=30min&outside_rth=false
URI Parameters
exchanges
str (required) Example: ARCA

the exchange(s) to check. Pass multiple times for multiple exchanges.

sec_type
str (optional) Example: STK

the security type, if needed to disambiguate for exchanges that trade multiple security types.

Choices: STK FUT CASH OPT

in
str (optional) Example: 1h

check whether exchanges will be open or closed at this point in the future (use Pandas timedelta string, e.g. 2h or 30min or 1d)

ago
str (optional) Example: 30min

check whether exchanges were open or closed this long ago (use Pandas timedelta string, e.g. 2h or 30min or 1d)

outside_rth
bool (optional) Example: false

check extended hours calendar (default is to check regular trading hours calendar)

Response  200
Headers
Content-Type: application/json
Body
{
  "ARCA": {
    "status": "open",
    "since": "2018-05-10T09:30:00",
    "until": "2018-05-10T16:00:00",
    "timezone": "America/New_York"
  }
}

Tick Sizes

Round prices
GET/master/ticksizes{?round_fields,how,append_ticksize}

Round prices in a CSV file to valid tick sizes.

CSV should contain columns Sid, Exchange, and the columns to be rounded (e.g. LmtPrice). Additional columns will be ignored and returned unchanged.

Example URI

GET http://houston/master/ticksizes?round_fields=LmtPrice&how=nearest&append_ticksize=true
URI Parameters
round_fields
str (required) Example: LmtPrice

columns to be rounded. Pass multiple times for multiple columns.

how
str (optional) Example: nearest

which direction to round to. Default ‘nearest’.

Choices: up down nearest

append_ticksize
boolean (optional) Example: true

append a column of tick sizes for each field to be rounded (default False)

Request
Headers
Content-Type: text/csv
Body
Sid,Account,Action,OrderRef,TotalQuantity,Exchange,OrderType,Tif,LmtPrice
FI13905888,DU12345,BUY,japan-strategy,1000,SMART,LMT,DAY,15203.1135
FI13905888,DDU12345,BUY,japan-strategy,1000,TSEJ,LMT,DAY,15203.1135
Response  200
Headers
Content-Type: text/csv
Body
Sid,Account,Action,OrderRef,TotalQuantity,Exchange,OrderType,Tif,LmtPrice
FI13905888,DU12345,BUY,japan-strategy,1000,SMART,LMT,DAY,15203.0
FI13905888,DDU12345,BUY,japan-strategy,1000,TSEJ,LMT,DAY,15205.0

quantrocket.moonshot

This API is for backtesting and live trading of Moonshot strategies. For writing Moonshot strategies, see the moonshot API.

QuantRocket Moonshot CLI

usage: quantrocket moonshot [-h] {backtest,paramscan,ml-walkforward,trade} ...

subcommands

subcommand

Possible choices: backtest, paramscan, ml-walkforward, trade

Sub-commands

backtest

backtest one or more strategies

quantrocket moonshot backtest [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD] [-g FREQ]
                              [-l [CODE:FLOAT ...]] [-n [CURRENCY:NLV ...]]
                              [-p [PARAM:VALUE ...]] [--no-cache] [-d] [--pdf]
                              [-o FILEPATH]
                              CODE [CODE ...]

Positional Arguments

CODE

one or more strategy codes

backtest options

-s, --start-date

the backtest start date (default is to use all available history)

-e, --end-date

the backtest end date (default is to use all available history)

-g, --segment

backtest in date segments of this size, to reduce memory usage (use Pandas frequency string, e.g. ‘A’ for annual segments or ‘Q’ for quarterly segments)

-l, --allocations

the allocation for each strategy, passed as ‘code:allocation’ (default allocation is 1.0 / number of strategies)

-n, --nlv

the NLV (net liquidation value, i.e. account balance) to assume for the backtest, expressed in each currency represented in the backtest (pass as ‘currency:nlv’)

-p, --params

one or more strategy params to set on the fly before backtesting (pass as ‘param:value’)

--no-cache

don’t use cached files even if available. Using cached files speeds up backtests but may be undesirable if underlying data has changed. See http://qrok.it/h/mcache to learn more about caching in Moonshot.

Default: False

output options

-d, --details

return detailed results for all securities instead of aggregating to strategy level (only supported for single-strategy backtests)

Default: False

--pdf

return a PDF performance tear sheet (default is to return a CSV of performance results)

-o, --outfile

the location to write the results file (omit to write to stdout)

Backtest one or more strategies.

By default returns a CSV of backtest results but can also return a PDF tear sheet of performance charts.

If testing multiple strategies, each column in the CSV represents a strategy. If testing a single strategy with the –details option, each column in the CSV represents a security in the strategy universe.

Notes

Usage Guide:

Examples

Backtest several HML (High Minus Low) strategies from 2005-2015 and return a CSV of results:

quantrocket moonshot backtest hml-us hml-eur hml-asia -s 2005-01-01 -e 2015-12-31 -o hml_results.csv

Backtest a single strategy called demo using all available history and return a PDF tear sheet:

quantrocket moonshot backtest demo --pdf -o tearsheet.pdf

Run a backtest in 1-year segments to reduce memory usage:

quantrocket moonshot backtest big-strategy -s 2000-01-01 -e 2018-01-01 --segment A -o results.csv

paramscan

run a parameter scan for one or more strategies

quantrocket moonshot paramscan [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD] [-g FREQ]
                               -p PARAM -v VALUE [VALUE ...] [--param2 PARAM]
                               [--vals2 [VALUE ...]] [-l [CODE:FLOAT ...]]
                               [-n [CURRENCY:NLV ...]]
                               [--params [PARAM:VALUE ...]]
                               [--num-workers INT] [--no-cache] [--pdf]
                               [-o FILEPATH]
                               CODE [CODE ...]

Positional Arguments

CODE

one or more strategy codes

backtest options

-s, --start-date

the backtest start date (default is to use all available history)

-e, --end-date

the backtest end date (default is to use all available history)

-g, --segment

backtest in date segments of this size, to reduce memory usage (use Pandas frequency string, e.g. ‘A’ for annual segments or ‘Q’ for quarterly segments)

-p, --param1

the name of the parameter to test (a class attribute on the strategy)

-v, --vals1

parameter values to test (values can be integers, floats, strings, ‘True’, ‘False’, ‘None’, or ‘default’ (to test current param value); for lists/tuples, use comma-separated values)

--param2

name of a second parameter to test (for 2-D parameter scans)

--vals2

values to test for parameter 2 (values can be integers, floats, strings, ‘True’, ‘False’, ‘None’, or ‘default’ (to test current param value); for lists/tuples, use comma-separated values)

-l, --allocations

the allocation for each strategy, passed as ‘code:allocation’ (default allocation is 1.0 / number of strategies)

-n, --nlv

the NLV (net liquidation value, i.e. account balance) to assume for the backtests, expressed in each currency represented in the backtest (pass as ‘currency:nlv’)

--params

one or more strategy params to set on the fly before running the parameter scan (pass as ‘param:value’)

--num-workers

the number of parallel workers to run. Running in parallel can speed up the parameter scan if your system has adequate resources. Default is 1, meaning no parallel processing.

--no-cache

don’t use cached files even if available. Using cached files speeds up backtests but may be undesirable if underlying data has changed. See http://qrok.it/h/mcache to learn more about caching in Moonshot.

Default: False

output options

--pdf

return a PDF tear sheet of results (default is to return a CSV)

-o, --outfile

the location to write the results file (omit to write to stdout)

Run a parameter scan for one or more strategies.

By default, returns a CSV of scan results which can be plotted with moonchart.ParamscanTearsheet, but can also return a PDF tear sheet.

Notes

Usage Guide:

Examples

Run a parameter scan for several different moving averages on a strategy called trend-friend and return a PDF:

quantrocket moonshot paramscan trend-friend -p MAVG_WINDOW -v 20 50 100 --pdf -o tearsheet.pdf

Run a 2-D parameter scan for multiple strategies and return a PDF:

quantrocket moonshot paramscan strat1 strat2 strat3 -p MIN_STD -v 1 1.5 2 --param2 STD_WINDOW --vals2 20 50 100 200 --pdf -o tearsheet.pdf

Run a parameter scan in 1-year segments to reduce memory usage:

quantrocket moonshot paramscan big-strategy -s 2000-01-01 -e 2018-01-01 --segment A -p MAVG_WINDOW -v 20 50 100 --pdf -o tearsheet.pdf

ml-walkforward

run a walk-forward optimization of a machine learning strategy

quantrocket moonshot ml-walkforward [-h] -s YYYY-MM-DD -e YYYY-MM-DD -t FREQ
                                    [-m FREQ] [-r FREQ] [-f MODEL_FILEPATH]
                                    [--force-nonincremental] [-g FREQ]
                                    [-l FLOAT] [-n [CURRENCY:NLV ...]]
                                    [-p [PARAM:VALUE ...]] [--no-cache] [-d]
                                    [--progress] [-o FILEPATH]
                                    CODE

Positional Arguments

CODE

the strategy code

walk-forward analysis options

-s, --start-date

the analysis start date (note that model training will start on this date but backtesting will not start until after the initial training period)

-e, --end-date

the analysis end date

-t, --train

train model this frequently (use Pandas frequency string, e.g. ‘A’ for annual training or ‘Q’ for quarterly training)

-m, --min-train

don’t backtest until at least this much model training has occurred; defaults to the length of --train if not specified (use Pandas frequency string, e.g. ‘5Y’ for 5 years of initial training)

-r, --rolling-train

train model with a rolling window of this length; if omitted, train model with an expanding window (use Pandas frequency string, e.g. ‘3Y’ for a 3-year rolling training window)

-f, --model

filepath of serialized model to use, filename must end in ‘.joblib’ or ‘.pkl’ (if omitted, default model is scikit-learn’s StandardScaler+SGDRegressor)

--force-nonincremental

force the model to be trained non-incrementally (i.e. load entire training data set into memory) even if it supports incremental learning. Required in order to perform a rolling (as opposed to expanding) walk-forward optimization with a model that supports incremental learning.

Default: False

backtest options

-g, --segment

train and backtest in date segments of this size, to reduce memory usage; must be smaller than --train/--min-train or will have no effect (use Pandas frequency string, e.g. ‘A’ for annual segments or ‘Q’ for quarterly segments)

-l, --allocation

the allocation for the strategy (default 1.0)

-n, --nlv

the NLV (net liquidation value, i.e. account balance) to assume for the backtest, expressed in each currency represented in the backtest (pass as ‘currency:nlv’)

-p, --params

one or more strategy params to set on the fly before backtesting (pass as ‘param:value’)

--no-cache

don’t use cached files even if available. Using cached files speeds up backtests but may be undesirable if underlying data has changed. See http://qrok.it/h/mcache to learn more about caching in Moonshot.

Default: False

output options

-d, --details

return detailed results for all securities instead of aggregating

Default: False

--progress

log status and Sharpe ratios of each walk-forward segment during analysis (default False)

Default: False

-o, --outfile

the location to write the ZIP file to; or, if the path ends with ‘*’, the pattern to use for extracting the zipped files. For example, if the path is my_ml*, files will be extracted to my_ml_results.csv and my_ml_trained_model.joblib.

Run a walk-forward optimization of a machine learning strategy.

The date range will be split into segments of --train size. For each segment, the model will be trained with the data, then the trained model will be backtested on the following segment.

By default, uses scikit-learn’s StandardScaler+SGDRegressor. Also supports other scikit-learn models/pipelines and Keras models. To customize the model, instantiate it locally, serialize it to disk, and pass the filename of the serialized model as --model.

Supports expanding walk-forward optimizations (the default), which use an anchored start date for model training, or rolling walk-forward optimizations (by specifying --rolling-train), which use a rolling or non-anchored start date for model training.

Returns a backtest results CSV and a dump of the machine learning model as of the end of the analysis.
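The expanding vs. rolling distinction can be sketched in plain Python; the years and the 3-year rolling window below are illustrative, not tied to any real strategy:

```python
# Illustrative only: with annual training ('A'), each model is trained
# on data up to a year boundary, then backtested on the following year.
years = list(range(2007, 2012))

# Expanding (default): the training window stays anchored at the start date
expanding = [(years[:i + 1], years[i + 1]) for i in range(len(years) - 1)]

# Rolling (--rolling-train 3Y): only the most recent 3 years are used
rolling = [(years[:i + 1][-3:], years[i + 1]) for i in range(len(years) - 1)]
```

In the expanding case the final model is trained on 2007-2010 and backtested on 2011; in the rolling case it is trained on 2008-2010 only.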

Notes

Usage Guide:

Examples

Run a walk-forward optimization using the default model and retrain the model annually, writing the backtest results and trained model to demo_ml_results.csv and demo_ml_trained_model.joblib, respectively:

quantrocket moonshot ml-walkforward demo-ml -s 2007-01-01 -e 2018-12-31 --train A -o demo_ml*

Run a walk-forward optimization using a custom model (serialized with joblib), retrain the model annually, don’t perform backtesting until after 5 years of initial training, and further split the training and backtesting into quarterly segments to reduce memory usage:

quantrocket moonshot ml-walkforward demo-ml -s 2007-01-01 -e 2018-12-31 --model my_model.joblib --train A --min-train 5Y --segment Q -o demo_ml*

trade

run one or more strategies and generate orders

quantrocket moonshot trade [-h] [-a [ACCOUNT ...]] [-r YYYY-MM-DD] [-j]
                           [-o FILEPATH]
                           CODE [CODE ...]

Positional Arguments

CODE

one or more strategy codes

Named Arguments

-a, --accounts

limit to these accounts

-r, --review-date

generate orders as if it were this date, rather than using today’s date

-j, --json

format orders as JSON (default is CSV)

-o, --outfile

the location to write the orders file (omit to write to stdout)

Run one or more strategies and generate orders.

Allocations are read from configuration (quantrocket.moonshot.allocations.yml).
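A minimal sketch of what the allocations file might contain — the account number, strategy codes, and weights below are purely illustrative:

```yaml
# quantrocket.moonshot.allocations.yml (illustrative values)
DU12345:
  umd-nyse: 0.5
  hml-us: 0.25
```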

Notes

Usage Guide:

Examples

Generate orders for a single strategy called umd-nyse:

quantrocket moonshot trade umd-nyse -o orders.csv

Generate orders and automatically place them (if any) through the blotter:

quantrocket moonshot trade umd-nyse | quantrocket blotter order -f -

Generate orders for multiple strategies for a particular account:

quantrocket moonshot trade umd-japan hml-japan --accounts DU12345 -o orders.csv

Generate orders as if it were an earlier date (for purpose of review):

quantrocket moonshot trade umd-nyse -o orders.csv --review-date 2018-05-11
quantrocket.moonshot.backtest(strategies, start_date=None, end_date=None, segment=None, allocations=None, nlv=None, params=None, details=None, output='csv', filepath_or_buffer=None, no_cache=False)

Backtest one or more strategies.

By default returns a CSV of backtest results but can also return a PDF tear sheet of performance charts.

If testing multiple strategies, each column in the CSV represents a strategy. If testing a single strategy and details=True, each column in the CSV represents a security in the strategy universe.

Parameters:
  • strategies (list of str, required) – one or more strategy codes

  • start_date (str (YYYY-MM-DD), optional) – the backtest start date (default is to use all available history)

  • end_date (str (YYYY-MM-DD), optional) – the backtest end date (default is to use all available history)

  • segment (str, optional) – backtest in date segments of this size, to reduce memory usage (use Pandas frequency string, e.g. ‘A’ for annual segments or ‘Q’ for quarterly segments)

  • allocations (dict of CODE:FLOAT, optional) – the allocation for each strategy, passed as {code:allocation} (default allocation is 1.0 / number of strategies)

  • nlv (dict of CURRENCY:NLV, optional) – the NLV (net liquidation value, i.e. account balance) to assume for the backtest, expressed in each currency represented in the backtest (pass as {currency:nlv})

  • params (dict of PARAM:VALUE, optional) – one or more strategy params to set on the fly before backtesting (pass as {param:value})

  • details (bool) – return detailed results for all securities instead of aggregating to strategy level (only supported for single-strategy backtests)

  • output (str, required) – the output format (choices are csv or pdf)

  • filepath_or_buffer (str or file-like object, optional) – the location to write the results file (omit to write to stdout)

  • no_cache (bool) – don’t use cached files even if available. Using cached files speeds up backtests but may be undesirable if underlying data has changed. See http://qrok.it/h/mcache to learn more about caching in Moonshot.

Return type:

None

See also

read_moonshot_csv

load a Moonshot backtest CSV into a DataFrame

Notes

Usage Guide:

Examples

Backtest several HML (High Minus Low) strategies from 2005-2015 and return a CSV of results:

>>> backtest(["hml-us", "hml-eur", "hml-asia"],
             start_date="2005-01-01",
             end_date="2015-12-31",
             filepath_or_buffer="hml_results.csv")

Run a backtest in 1-year segments to reduce memory usage:

>>> backtest("big-strategy",
             start_date="2000-01-01",
             end_date="2018-01-01",
             segment="A",
             filepath_or_buffer="results.csv")
quantrocket.moonshot.read_moonshot_csv(filepath_or_buffer)

Load a Moonshot backtest CSV into a DataFrame.

This is a light wrapper around pd.read_csv that handles setting index columns and casting to proper data types.

Parameters:

filepath_or_buffer (string or file-like, required) – path to CSV

Returns:

a multi-index (Field, Date[, Time]) DataFrame of backtest results, with sids or strategy codes as columns

Return type:

DataFrame

Notes

Usage Guide:

Examples

>>> results = read_moonshot_csv("moonshot_backtest.csv")
>>> returns = results.loc["Return"]
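To illustrate the shape of the returned DataFrame, a toy frame with the same (Field, Date) multi-index can stand in for an actual backtest CSV; the field names follow the docs, but the values and strategy code are made up:

```python
import pandas as pd

# Toy stand-in for read_moonshot_csv output: a (Field, Date) multi-index
# with one column per strategy (or per sid). All values are illustrative.
index = pd.MultiIndex.from_product(
    [["Return", "NetExposure"], pd.to_datetime(["2020-01-02", "2020-01-03"])],
    names=["Field", "Date"])
results = pd.DataFrame({"demo": [0.01, -0.005, 1.0, 1.0]}, index=index)

# Selecting a field drops that index level, leaving a Date-indexed frame
returns = results.loc["Return"]
cum_returns = (1 + returns).cumprod()
```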
quantrocket.moonshot.scan_parameters(strategies, start_date=None, end_date=None, segment=None, param1=None, vals1=None, param2=None, vals2=None, allocations=None, nlv=None, params=None, num_workers=None, output='csv', filepath_or_buffer=None, no_cache=False)

Run a parameter scan for one or more strategies.

By default, returns a CSV of scan results which can be plotted with moonchart.ParamscanTearsheet, but can also return a PDF tear sheet.

Parameters:
  • strategies (list of str, required) – one or more strategy codes

  • start_date (str (YYYY-MM-DD), optional) – the backtest start date (default is to use all available history)

  • end_date (str (YYYY-MM-DD), optional) – the backtest end date (default is to use all available history)

  • segment (str, optional) – backtest in date segments of this size, to reduce memory usage (use Pandas frequency string, e.g. ‘A’ for annual segments or ‘Q’ for quarterly segments)

  • param1 (str, required) – the name of the parameter to test (a class attribute on the strategy)

  • vals1 (list of int/float/str/tuple, required) – parameter values to test (values can be ints, floats, strings, False, True, None, ‘default’ (to test current param value), or lists of ints/floats/strings)

  • param2 (str, optional) – name of a second parameter to test (for 2-D parameter scans)

  • vals2 (list of int/float/str/tuple, optional) – values to test for parameter 2 (values can be ints, floats, strings, False, True, None, ‘default’ (to test current param value), or lists of ints/floats/strings)

  • allocations (dict of CODE:FLOAT, optional) – the allocation for each strategy, passed as {code:allocation} (default allocation is 1.0 / number of strategies)

  • nlv (dict of CURRENCY:NLV, optional) – the NLV (net liquidation value, i.e. account balance) to assume for the backtest, expressed in each currency represented in the backtest (pass as {currency:nlv})

  • params (dict of PARAM:VALUE, optional) – one or more strategy params to set on the fly before running the parameter scan (pass as {param:value})

  • num_workers (int, optional) – the number of parallel workers to run. Running in parallel can speed up the parameter scan if your system has adequate resources. Default is 1, meaning no parallel processing.

  • output (str, required) – the output format (choices are csv or pdf)

  • filepath_or_buffer (str, optional) – the location to write the results file (omit to write to stdout)

  • no_cache (bool) – don’t use cached files even if available. Using cached files speeds up backtests but may be undesirable if underlying data has changed. See http://qrok.it/h/mcache to learn more about caching in Moonshot.

Return type:

None

Notes

Usage Guide:

Examples

Run a parameter scan for several different moving averages on a strategy called trend-friend, then view a tear sheet of the results:

>>> from moonchart import ParamscanTearsheet
>>> scan_parameters("trend-friend",
                    param1="MAVG_WINDOW",
                    vals1=[20, 50, 100],
                    filepath_or_buffer="trend_friend_MAVG_WINDOW.csv")
>>> ParamscanTearsheet.from_csv("trend_friend_MAVG_WINDOW.csv")

Run a 2-D parameter scan for multiple strategies and return a CSV:

>>> scan_parameters(["strat1", "strat2", "strat3"],
                    param1="MIN_STD",
                    vals1=[1, 1.5, 2],
                    param2="STD_WINDOW",
                    vals2=[20, 50, 100, 200],
                    filepath_or_buffer="strategies_MIN_STD_and_STD_WINDOW.csv")
>>> ParamscanTearsheet.from_csv("strategies_MIN_STD_and_STD_WINDOW.csv")

Run a parameter scan in 1-year segments to reduce memory usage:

>>> scan_parameters("big-strategy",
                    start_date="2000-01-01",
                    end_date="2018-01-01",
                    segment="A",
                    param1="MAVG_WINDOW",
                    vals1=[20, 50, 100],
                    filepath_or_buffer="big_strategy_MAVG_WINDOW.csv")
quantrocket.moonshot.ml_walkforward(strategy, start_date, end_date, train, min_train=None, rolling_train=None, model_filepath=None, force_nonincremental=None, segment=None, allocation=None, nlv=None, params=None, details=None, progress=False, filepath_or_buffer=None, no_cache=False)

Run a walk-forward optimization of a machine learning strategy.

The date range will be split into segments of train size. For each segment, the model will be trained with the data, then the trained model will be backtested on the following segment.

By default, uses scikit-learn’s StandardScaler+SGDRegressor. Also supports other scikit-learn models/pipelines and Keras models. To customize the model, instantiate it locally, serialize it to disk, and pass the path of the serialized model as model_filepath.

Supports expanding walk-forward optimizations (the default), which use an anchored start date for model training, or rolling walk-forward optimizations (by specifying rolling_train), which use a rolling or non-anchored start date for model training.

Returns a backtest results CSV and a dump of the machine learning model as of the end of the analysis.

Parameters:
  • strategy (str, required) – the strategy code

  • start_date (str (YYYY-MM-DD), required) – the analysis start date (note that model training will start on this date but backtesting will not start until after the initial training period)

  • end_date (str (YYYY-MM-DD), required) – the analysis end date

  • train (str, required) – train model this frequently (use Pandas frequency string, e.g. ‘A’ for annual training or ‘Q’ for quarterly training)

  • min_train (str, optional) – don’t backtest until at least this much model training has occurred; defaults to the length of train if not specified (use Pandas frequency string, e.g. ‘5Y’ for 5 years of initial training)

  • rolling_train (str, optional) – train model with a rolling window of this length; if omitted, train model with an expanding window (use Pandas frequency string, e.g. ‘3Y’ for a 3-year rolling training window)

  • model_filepath (str, optional) – filepath of serialized model to use, filename must end in “.joblib” or “.pkl” (if omitted, default model is scikit-learn’s StandardScaler+SGDRegressor)

  • force_nonincremental (bool, optional) – force the model to be trained non-incrementally (i.e. load entire training data set into memory) even if it supports incremental learning. Must be True in order to perform a rolling (as opposed to expanding) walk-forward optimization with a model that supports incremental learning. Default False.

  • segment (str, optional) – train and backtest in date segments of this size, to reduce memory usage; must be smaller than train/min_train or will have no effect (use Pandas frequency string, e.g. ‘A’ for annual segments or ‘Q’ for quarterly segments)

  • allocation (float, optional) – the allocation for the strategy (default 1.0)

  • nlv (dict of CURRENCY:NLV, optional) – the NLV (net liquidation value, i.e. account balance) to assume for the backtest, expressed in each currency represented in the backtest (pass as {currency:nlv})

  • params (dict of PARAM:VALUE, optional) – one or more strategy params to set on the fly before backtesting (pass as {param:value})

  • details (bool) – return detailed results for all securities instead of aggregating

  • progress (bool) – log status and Sharpe ratios of each walk-forward segment during analysis (default False)

  • filepath_or_buffer (str, optional) – the location to write the ZIP file to; or, if the path ends with “*”, the pattern to use for extracting the zipped files. For example, if the path is my_ml*, files will be extracted to my_ml_results.csv and my_ml_trained_model.joblib.

  • no_cache (bool) – don’t use cached files even if available. Using cached files speeds up backtests but may be undesirable if underlying data has changed. See http://qrok.it/h/mcache to learn more about caching in Moonshot.

Return type:

None

Notes

Usage Guide:

Examples

Run a walk-forward optimization using the default model and retrain the model annually, writing the backtest results and trained model to demo_ml_results.csv and demo_ml_trained_model.joblib, respectively:

>>> ml_walkforward(
        "demo-ml",
        "2007-01-01",
        "2018-12-31",
        train="A",
        filepath_or_buffer="demo_ml*")

Create a scikit-learn model, serialize it with joblib, and use it to run the walkforward backtest:

>>> from sklearn.linear_model import SGDClassifier
>>> import joblib
>>> clf = SGDClassifier()
>>> joblib.dump(clf, "my_model.joblib")
>>> ml_walkforward(
        "demo-ml",
        "2007-01-01",
        "2018-12-31",
        train="A",
        model_filepath="my_model.joblib",
        filepath_or_buffer="demo_ml*")

Run a walk-forward optimization using a custom model (serialized with joblib), retrain the model annually, don’t perform backtesting until after 5 years of initial training, and further split the training and backtesting into quarterly segments to reduce memory usage:

>>> ml_walkforward(
        "demo-ml",
        "2007-01-01",
        "2018-12-31",
        model_filepath="my_model.joblib",
        train="A",
        min_train="5Y",
        segment="Q",
        filepath_or_buffer="demo_ml*")

Create a Keras model, serialize it, and use it to run the walkforward backtest:

>>> from keras.models import Sequential
>>> from keras.layers import Dense
>>> model = Sequential()
>>> # input_dim should match number of features in training data
>>> model.add(Dense(units=4, activation='relu', input_dim=5))
>>> # last layer should have a single unit
>>> model.add(Dense(units=1, activation='sigmoid'))
>>> model.compile(loss='binary_crossentropy',
                  optimizer='sgd',
                  metrics=['accuracy'])
>>> model.save('my_model.keras.h5')
>>> ml_walkforward(
        "neuralnet-ml",
        "2007-01-01",
        "2018-12-31",
        train="A",
        model_filepath="my_model.keras.h5",
        filepath_or_buffer="neuralnet_ml*")
quantrocket.moonshot.trade(strategies, accounts=None, review_date=None, output='csv', filepath_or_buffer=None)

Run one or more strategies and generate orders.

Allocations are read from configuration (quantrocket.moonshot.allocations.yml).

Parameters:
  • strategies (list of str, required) – one or more strategy codes

  • accounts (list of str, optional) – limit to these accounts

  • review_date (str (YYYY-MM-DD), optional) – generate orders as if it were this date, rather than using today’s date

  • output (str, required) – the output format (choices are csv or json)

  • filepath_or_buffer (str, optional) – the location to write the orders file (omit to write to stdout)

Return type:

None

Notes

Usage Guide:


Moonshot API

Resource Group

Backtests

Run Backtest
POST/moonshot/backtests.{output}{?strategies,start_date,end_date,segment,allocations,nlv,params,details,no_cache}

Backtest one or more strategies and return a CSV of backtest results or a PDF tear sheet of performance charts.

If testing multiple strategies, each column in the CSV represents a strategy. If testing a single strategy and details=True, each column in the CSV represents a security in the strategy universe.

Example URI

POST http://houston/moonshot/backtests.csv?strategies=umd-nyse&start_date=2015-02-01&end_date=2017-06-06&segment=A&allocations=umd-nyse:0.25&nlv=USD:500000&params=BENCHMARK:None&details=true&no_cache=false
URI Parameters
strategies
str (required) Example: umd-nyse

the strategy code to test (pass multiple times for multiple strategies)

start_date
str (optional) Example: 2015-02-01

the backtest start date (default is to use all available history)

end_date
str (optional) Example: 2017-06-06

the backtest end date (default is to use all available history)

segment
str (optional) Example: A

backtest in date segments of this size, to reduce memory usage (use Pandas frequency string, e.g. ‘A’ for annual segments or ‘Q’ for quarterly segments)

allocations
str (optional) Example: umd-nyse:0.25

the allocation for each strategy, passed as ‘code:allocation’ (default allocation is 1.0 / number of strategies) (pass multiple times for multiple strategies)

nlv
str (optional) Example: USD:500000

the NLV (net liquidation value, i.e. account balance) to assume for the backtest, expressed in each currency represented in the backtest (pass as ‘currency:nlv’) (pass multiple times for multiple currencies)

params
str (optional) Example: BENCHMARK:None

one or more strategy params to set on the fly before backtesting (pass as ‘param:value’) (pass multiple times for multiple params)

details
bool (optional) Example: true

return detailed results for all securities instead of aggregating to strategy level (only supported for single-strategy backtests)

output
str (required) Example: csv

the output format (choices are csv or pdf)

no_cache
bool (required) Example: false

don’t use cached files even if available. Using cached files speeds up backtests but may be undesirable if underlying data has changed.

Response  200
Headers
Content-Type: text/csv
Response  200
Headers
Content-Type: application/pdf
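Parameters documented as “pass multiple times” (strategies, allocations, nlv, params) are repeated in the query string. A small urllib sketch shows the encoding; the strategy codes are illustrative:

```python
from urllib.parse import urlencode

# Repeat a key once per value to pass multiple strategies
params = [("strategies", "hml-us"), ("strategies", "hml-eur"),
          ("start_date", "2005-01-01")]
query = urlencode(params)
# query: "strategies=hml-us&strategies=hml-eur&start_date=2005-01-01"
```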

Parameter Scans

Run Parameter Scan
POST/moonshot/paramscans.{output}{?strategies,start_date,end_date,segment,param1,vals1,param2,vals2,allocations,nlv,params,no_cache,num_workers}

Run a parameter scan for one or more strategies. By default, returns a CSV of scan results which can be plotted with moonchart.ParamscanTearsheet, but can also return a PDF tear sheet.

Example URI

POST http://houston/moonshot/paramscans.csv?strategies=umd-nyse&start_date=2015-02-01&end_date=2017-06-06&segment=A&param1=SMAVG_WINDOW&vals1=20&param2=LMAVG_WINDOW&vals2=180&allocations=umd-nyse:0.25&nlv=USD:500000&params=BENCHMARK:None&no_cache=false&num_workers=2
URI Parameters
strategies
str (required) Example: umd-nyse

the strategy code to test (pass multiple times for multiple strategies)

start_date
str (optional) Example: 2015-02-01

the backtest start date (default is to use all available history)

end_date
str (optional) Example: 2017-06-06

the backtest end date (default is to use all available history)

segment
str (optional) Example: A

backtest in date segments of this size, to reduce memory usage (use Pandas frequency string, e.g. ‘A’ for annual segments or ‘Q’ for quarterly segments)

param1
str (required) Example: SMAVG_WINDOW

the name of the parameter to test (a class attribute on the strategy)

vals1
str (required) Example: 20

parameter values to test (values can be ints, floats, strings, False, True, None, ‘default’ (to test current param value), or lists of ints/floats/strings) (pass multiple times for multiple values)

param2
str (optional) Example: LMAVG_WINDOW

name of a second parameter to test (for 2-D parameter scans)

vals2
str (optional) Example: 180

values to test for parameter 2 (values can be ints, floats, strings, False, True, None, ‘default’ (to test current param value), or lists of ints/floats/strings) (pass multiple times for multiple values)

allocations
str (optional) Example: umd-nyse:0.25

the allocation for each strategy, passed as ‘code:allocation’ (default allocation is 1.0 / number of strategies) (pass multiple times for multiple strategies)

nlv
str (optional) Example: USD:500000

the NLV (net liquidation value, i.e. account balance) to assume for the backtest, expressed in each currency represented in the backtest (pass as ‘currency:nlv’) (pass multiple times for multiple currencies)

params
str (optional) Example: BENCHMARK:None

one or more strategy params to set on the fly before backtesting (pass as ‘param:value’) (pass multiple times for multiple params)

num_workers
int (optional) Example: 2

the number of parallel workers to run. Running in parallel can speed up the parameter scan if your system has adequate resources. Default is 1, meaning no parallel processing.

output
str (required) Example: csv

the output format (choices are csv or pdf)

no_cache
bool (required) Example: false

don’t use cached files even if available. Using cached files speeds up backtests but may be undesirable if underlying data has changed.

Response  200
Headers
Content-Type: text/csv
Response  200
Headers
Content-Type: application/pdf

Walk-forward Optimizations

Run Walk-forward Optimization
POST/moonshot/ml/walkforward/{code}.zip{?start_date,end_date,train,min_train,rolling_train,force_nonincremental,segment,allocation,nlv,params,details,progress,no_cache}

Run a walk-forward optimization of a machine learning strategy.

The date range will be split into segments of train size. For each segment, the model will be trained with the data, then the trained model will be backtested on the following segment.

By default, uses scikit-learn’s StandardScaler+SGDRegressor. Also supports other scikit-learn models/pipelines and Keras models. To customize the model, instantiate it locally, serialize it to disk, and upload the serialized model.

Supports expanding walk-forward optimizations (the default), which use an anchored start date for model training, or rolling walk-forward optimizations (by specifying rolling_train), which use a rolling or non-anchored start date for model training.

Returns a backtest results CSV and a dump of the machine learning model as of the end of the analysis.

Example URI

POST http://houston/moonshot/ml/walkforward/demo-ml.zip?start_date=2015-02-01&end_date=2017-06-06&train=Y&min_train=5Y&rolling_train=3Y&force_nonincremental=false&segment=A&allocation=1.0&nlv=USD:500000&params=BENCHMARK:None&details=true&progress=true&no_cache=false
URI Parameters
code
str (required) Example: demo-ml

the strategy code

start_date
str (required) Example: 2015-02-01

the analysis start date (note that model training will start on this date but backtesting will not start until after the initial training period)

end_date
str (required) Example: 2017-06-06

the analysis end date

train
str (required) Example: Y

train model this frequently (use Pandas frequency string, e.g. ‘A’ for annual training or ‘Q’ for quarterly training)

min_train
str (optional) Example: 5Y

don’t backtest until at least this much model training has occurred; defaults to the length of train if not specified (use Pandas frequency string, e.g. ‘5Y’ for 5 years of initial training)

rolling_train
str (optional) Example: 3Y

train model with a rolling window of this length; if omitted, train model with an expanding window (use Pandas frequency string, e.g. ‘3Y’ for a 3-year rolling training window)

force_nonincremental
bool (optional) Example: false

force the model to be trained non-incrementally (i.e. load entire training data set into memory) even if it supports incremental learning. Must be true in order to perform a rolling (as opposed to expanding) walk-forward optimization with a model that supports incremental learning. Default false.

segment
str (optional) Example: A

train and backtest in date segments of this size, to reduce memory usage; must be smaller than train/min_train or will have no effect (use Pandas frequency string, e.g. ‘A’ for annual segments or ‘Q’ for quarterly segments)

allocation
float (optional) Example: 1.0

the allocation for the strategy (default 1.0)

nlv
str (optional) Example: USD:500000

the NLV (net liquidation value, i.e. account balance) to assume for the backtest, expressed in each currency represented in the backtest (pass as ‘currency:nlv’) (pass multiple times for multiple currencies)

params
str (optional) Example: BENCHMARK:None

one or more strategy params to set on the fly before backtesting (pass as ‘param:value’) (pass multiple times for multiple params)

details
bool (optional) Example: true

return detailed results for all securities instead of aggregating

progress
bool (optional) Example: true

log status and Sharpe ratios of each walk-forward segment during analysis (default false)

no_cache
bool (required) Example: false

don’t use cached files even if available. Using cached files speeds up backtests but may be undesirable if underlying data has changed.

Response  200
Headers
Content-Type: application/zip

Orders

Generate Orders
GET/moonshot/orders.{output}{?strategies,accounts,review_date}

Run one or more strategies and generate orders. Allocations are read from configuration (quantrocket.moonshot.allocations.yml).

Example URI

GET http://houston/moonshot/orders.csv?strategies=umd-nyse&accounts=U12345&review_date=2018-05-18
URI Parameters
strategies
str (required) Example: umd-nyse

the strategy code to run (pass multiple times for multiple strategies)

accounts
str (optional) Example: U12345

limit to these accounts (pass multiple times for multiple accounts)

review_date
str (optional) Example: 2018-05-18

generate orders as if it were this date, rather than using today’s date

output
str (required) Example: csv

the output format (choices are csv or json)

Response  200
Headers
Content-Type: text/csv
Response  200
Headers
Content-Type: application/json

quantrocket.realtime

real-time market data service

QuantRocket real-time market data CLI

usage: quantrocket realtime [-h]
                            {create-ibkr-tick-db,create-polygon-tick-db,create-alpaca-tick-db,create-agg-db,config,drop-db,drop-ticks,list,collect,active,cancel,get,stream}
                            ...

subcommands

subcommand

Possible choices: create-ibkr-tick-db, create-polygon-tick-db, create-alpaca-tick-db, create-agg-db, config, drop-db, drop-ticks, list, collect, active, cancel, get, stream

Sub-commands

create-ibkr-tick-db

create a new database for collecting real-time tick data from Interactive Brokers

quantrocket realtime create-ibkr-tick-db [-h] [-u [UNIVERSE ...]]
                                         [-i [SID ...]] [-f [FIELD ...]] [-p]
                                         CODE

Positional Arguments

CODE

the code to assign to the database (lowercase alphanumerics and hyphens only)

Named Arguments

-u, --universes

include these universes

-i, --sids

include these sids

-f, --fields

collect these fields (pass ‘?’ or any invalid fieldname to see available fields, default fields are ‘LastPrice’ and ‘Volume’)

-p, --primary-exchange

limit to data from the primary exchange

Default: False

Create a new database for collecting real-time tick data from Interactive Brokers.

The market data requirements you specify when you create a new database are applied each time you collect data for that database.

Notes

Usage Guide:

Examples

Create a database for collecting real-time trades and volume for US stocks:

quantrocket realtime create-ibkr-tick-db usa-stk-trades -u usa-stk --fields LastPrice Volume

Create a database for collecting trades and quotes for a universe of futures:

quantrocket realtime create-ibkr-tick-db cme-fut-taq -u cme-fut --fields LastPrice Volume BidPrice AskPrice BidSize AskSize

create-polygon-tick-db

create a new database for collecting real-time tick data from Polygon

quantrocket realtime create-polygon-tick-db [-h] [-u [UNIVERSE ...]]
                                            [-i [SID ...]] [-f [FIELD ...]]
                                            CODE

Positional Arguments

CODE

the code to assign to the database (lowercase alphanumerics and hyphens only)

Named Arguments

-u, --universes

include these universes

-i, --sids

include these sids

-f, --fields

collect these fields (pass ‘?’ or any invalid fieldname to see available fields, default fields are ‘LastPrice’ and ‘LastSize’)

Create a new database for collecting real-time tick data from Polygon.

The market data requirements you specify when you create a new database are applied each time you collect data for that database.

Notes

Usage Guide:

Examples

Create a database for collecting real-time trade prices and sizes for US stocks:

quantrocket realtime create-polygon-tick-db usa-stk-trades -u usa-stk --fields LastPrice LastSize

create-alpaca-tick-db

create a new database for collecting real-time tick data from Alpaca

quantrocket realtime create-alpaca-tick-db [-h] [-u [UNIVERSE ...]]
                                           [-i [SID ...]] [-f [FIELD ...]]
                                           CODE

Positional Arguments

CODE

the code to assign to the database (lowercase alphanumerics and hyphens only)

Named Arguments

-u, --universes

include these universes

-i, --sids

include these sids

-f, --fields

collect these fields (pass ‘?’ or any invalid fieldname to see available fields, default fields are ‘LastPrice’ and ‘LastSize’)

Create a new database for collecting real-time tick data from Alpaca.

The market data requirements you specify when you create a new database are applied each time you collect data for that database.

Notes

Usage Guide:

Examples

Create a database for collecting real-time trade prices and sizes for US stocks:

quantrocket realtime create-alpaca-tick-db usa-stk-trades -u usa-stk --fields LastPrice LastSize

create-agg-db

create an aggregate database from a tick database

quantrocket realtime create-agg-db [-h] -t CODE -z BAR_SIZE [-f [FIELD ...]]
                                   CODE

Positional Arguments

CODE

the code to assign to the aggregate database (lowercase alphanumerics and hyphens only)

Named Arguments

-t, --tick-db

the code of the tick database to aggregate

-z, --bar-size

the time frequency to aggregate to (use a Pandas timedelta string, for example 10s or 1m or 2h or 1d)

-f, --fields

include these fields in aggregate database, aggregated in these ways. Specify as a list of strings mapping tick db fields to a comma-separated list of aggregate functions to apply to the field. Format strings as ‘FIELD:FUNC1,FUNC2’. Available aggregate functions are ‘Close’, ‘Open’, ‘High’, ‘Low’, ‘Mean’, ‘Sum’, and ‘Count’. See examples. If not specified, defaults to including the ‘Close’ for each tick db field.

Create an aggregate database from a tick database.

Aggregate databases provide rolled-up views of the underlying tick data, aggregated to a desired frequency (such as 1-minute bars).

Notes

Usage Guide:

Examples

Create an aggregate database of 1-minute bars consisting of OHLC trades and volume, from a tick database of US stocks, resulting in fields called LastPriceOpen, LastPriceHigh, LastPriceLow, LastPriceClose, and VolumeClose:

quantrocket realtime create-agg-db usa-stk-trades-1min --tick-db usa-stk-trades -z 1m -f LastPrice:Open,High,Low,Close Volume:Close

Create an aggregate database of 1-second bars containing the closing bid and ask and the mean bid size and ask size, from a tick database of futures trades and quotes, resulting in fields called BidPriceClose, AskPriceClose, BidSizeMean, and AskSizeMean:

quantrocket realtime create-agg-db cme-fut-taq-1sec --tick-db cme-fut-taq -z 1s -f BidPrice:Close AskPrice:Close BidSize:Mean AskSize:Mean

config

return the configuration for a tick database or aggregate database

quantrocket realtime config [-h] code

Positional Arguments

code

the tick database code or aggregate database code

Return the configuration for a tick database or aggregate database.

Notes

Usage Guide:

Examples

Return the configuration for a tick database called “cme-fut-taq”:

quantrocket realtime config cme-fut-taq

Return the configuration for an aggregate database called “cme-fut-taq-1s”:

quantrocket realtime config cme-fut-taq-1s

drop-db

delete a tick database or aggregate database

quantrocket realtime drop-db [-h] --confirm-by-typing-db-code-again CODE
                             [--cascade]
                             code

Positional Arguments

code

the tick database code or aggregate database code

Named Arguments

--confirm-by-typing-db-code-again

enter the db code again to confirm you want to drop the database, its config, and all its data

--cascade

also delete associated aggregate databases, if any. Only applicable when deleting a tick database.

Default: False

Delete a tick database or aggregate database.

Deleting a tick database deletes its configuration and data and any associated aggregate databases. Deleting an aggregate database does not delete the tick database from which it is derived.

Deleting databases is irreversible.

Notes

Usage Guide:

Examples

Delete a database called “usa-stk-trades”:

quantrocket realtime drop-db usa-stk-trades --confirm-by-typing-db-code-again usa-stk-trades

drop-ticks

delete ticks from a tick database

quantrocket realtime drop-ticks [-h] -o TIMEDELTA code

Positional Arguments

code

the tick database code

Named Arguments

-o, --older-than

delete ticks older than this (use a Pandas timedelta string, for example 7d)

Delete ticks from a tick database. Does not delete any aggregate database records.

Deleting ticks is a way to free up disk space by deleting ticks older than a certain threshold while maintaining the ability to continue collecting new ticks as well as use any aggregate databases derived from the ticks.

Note: ticks are stored in the database in chunks, and this command only deletes chunks in which all of the ticks are older than you specify. If some of the ticks are older but some are newer, the chunk is not deleted. This means you may still see older data returned in queries.

Notes

Usage Guide:

Examples

Delete ticks older than 7 days in a database called ‘usa-tech-stk-tick’ (no aggregate records are deleted):

quantrocket realtime drop-ticks usa-tech-stk-tick --older-than 7d
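The chunk-deletion rule noted above can be illustrated with a simplified model. This sketch is illustrative only and does not reflect QuantRocket's internal storage format: a chunk survives if any of its ticks is newer than the cutoff.

```python
# Simplified model of chunk-level tick deletion (not QuantRocket internals):
# a chunk is deleted only if every tick in it is older than the cutoff.
from datetime import datetime, timedelta

now = datetime(2024, 1, 8)
cutoff = now - timedelta(days=7)  # drop ticks "older than 7d"

chunks = [
    [datetime(2023, 12, 28), datetime(2023, 12, 29)],  # all older: deleted
    [datetime(2023, 12, 31), datetime(2024, 1, 2)],    # mixed ages: kept whole
]
remaining = [chunk for chunk in chunks if not all(ts < cutoff for ts in chunk)]
print(len(remaining))  # 1 chunk kept, including its older ticks
```

This is why queries may still return some ticks older than the threshold after a drop-ticks call: the surviving mixed-age chunk retains its older ticks.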

list

list tick databases and associated aggregate databases

quantrocket realtime list [-h]

List tick databases and associated aggregate databases.

Notes

Usage Guide:

Examples

quantrocket realtime list

collect

collect real-time market data and save it to a tick database

quantrocket realtime collect [-h] [-i [SID ...]] [-u [UNIVERSE ...]]
                             [-f [FIELD ...]] [--until TIME_OR_TIMEDELTA] [-s]
                             [-w]
                             CODE [CODE ...]

Positional Arguments

CODE

the tick database code(s) to collect data for

Named Arguments

-i, --sids

collect market data for these sids, overriding db config (typically used to collect a subset of securities)

-u, --universes

collect market data for these universes, overriding db config (typically used to collect a subset of securities)

-f, --fields

limit to these fields, overriding db config

--until

schedule data collection to end at this time. Can be a datetime (YYYY-MM-DD HH:MM:SS), a time (HH:MM:SS), or a Pandas timedelta string (e.g. 2h or 30min). If not provided, market data is collected until cancelled.

-s, --snapshot

collect a snapshot of market data (default is to collect a continuous stream of market data)

Default: False

-w, --wait

wait for market data snapshot to complete before returning (default is to return immediately). Requires --snapshot

Default: False

Collect real-time market data and save it to a tick database.

A single snapshot of market data or a continuous stream of market data can be collected, depending on the --snapshot parameter. (Snapshots are not supported for all vendors.)

Streaming real-time data is collected until cancelled, or can be scheduled for cancellation using the --until parameter.

Notes

Usage Guide:

Examples

Collect market data for all securities in a tick database called ‘japan-banks-trades’:

quantrocket realtime collect japan-banks-trades

Collect market data for a subset of securities in a tick database called ‘usa-stk-trades’ and automatically cancel the data collection in 30 minutes:

quantrocket realtime collect usa-stk-trades --sids FIBBG12345 FIBBG23456 FIBBG34567 --until 30m

Collect a market data snapshot and wait until it completes:

quantrocket realtime collect usa-stk-trades --snapshot --wait
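The duration form accepted by --until is a standard Pandas timedelta string (the same format used by --bar-size and --older-than elsewhere in this service). A quick sanity check of how such strings parse:

```python
import pandas as pd

# "30min" and "2h" are Pandas timedelta strings, as accepted by --until
print(pd.Timedelta("30min"))               # 0 days 00:30:00
print(pd.Timedelta("2h").total_seconds())  # 7200.0
```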

active

return the number of tickers currently being collected, by vendor and database

quantrocket realtime active [-h] [-d]

Named Arguments

-d, --detail

return lists of tickers (default is to return counts of tickers)

Default: False

Return the number of tickers currently being collected, by vendor and database.

Notes

Usage Guide:

Examples

quantrocket realtime active

cancel

cancel market data collection

quantrocket realtime cancel [-h] [-i [SID ...]] [-u [UNIVERSE ...]] [-a]
                            [CODE ...]

Positional Arguments

CODE

the tick database code(s) to cancel collection for

Named Arguments

-i, --sids

cancel market data for these sids, overriding db config

-u, --universes

cancel market data for these universes, overriding db config

-a, --all

cancel all market data collection

Default: False

Cancel market data collection.

Notes

Usage Guide:

Examples

Cancel market data collection for a tick database called ‘cme-fut-taq’:

quantrocket realtime cancel cme-fut-taq

Cancel all market data collection:

quantrocket realtime cancel --all

get

query market data from a tick database or aggregate database and download to file

quantrocket realtime get [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                         [-u [UNIVERSE ...]] [-i [SID ...]]
                         [--exclude-universes [UNIVERSE ...]]
                         [--exclude-sids [SID ...]] [-o OUTFILE] [-j]
                         [-f [FIELD ...]]
                         CODE

Positional Arguments

CODE

the code of the tick database or aggregate database to query

filtering options

-s, --start-date

limit to market data on or after this datetime. Can pass a date (YYYY-MM-DD), datetime with optional timezone (YYYY-MM-DD HH:MM:SS TZ), or time with optional timezone. A time without date will be interpreted as referring to today if the time is earlier than now, or yesterday if the time is later than now.

-e, --end-date

limit to market data on or before this datetime. Can pass a date (YYYY-MM-DD), datetime with optional timezone (YYYY-MM-DD HH:MM:SS TZ), or time with optional timezone.

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

output options

-o, --outfile

filename to write the data to (default is stdout)

-j, --json

format output as JSON (default is CSV)

-f, --fields

only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Query market data from a tick database or aggregate database and download to file.

Notes

Usage Guide:

Examples

Download a CSV of futures market data since 08:00 AM Chicago time:

quantrocket realtime get cme-fut-taq --start-date '08:00:00 America/Chicago' -o cme_taq.csv
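The rule for interpreting a time without a date (today if the time is earlier than now, yesterday if later) can be sketched as follows. This is an illustration of the documented rule, not QuantRocket's actual implementation:

```python
# Sketch of how a time-only start date is resolved relative to "now":
# today if the time has already passed, otherwise yesterday.
from datetime import datetime, time, timedelta

def resolve_time_only(t: time, now: datetime) -> datetime:
    candidate = datetime.combine(now.date(), t)
    if candidate <= now:
        return candidate                      # earlier than now -> today
    return candidate - timedelta(days=1)      # later than now -> yesterday

now = datetime(2024, 1, 2, 12, 0)
print(resolve_time_only(time(8, 0), now).date())   # 2024-01-02 (today)
print(resolve_time_only(time(18, 0), now).date())  # 2024-01-01 (yesterday)
```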

stream

stream incoming market data

quantrocket realtime stream [-h] [-i [SID ...]] [--exclude-sids [SID ...]]
                            [-f [FIELD ...]]

Named Arguments

-i, --sids

limit to these sids

--exclude-sids

exclude these sids

-f, --fields

limit to these fields

Stream incoming market data.

This command does not cause data to be collected but connects to the stream of data already being collected.

Notes

Usage Guide:

Examples

Stream all incoming market data:

quantrocket realtime stream

Stream a subset of fields and sids:

quantrocket realtime stream --sids FIBBG265598 --fields BidPrice AskPrice
quantrocket.realtime.create_ibkr_tick_db(code, universes=None, sids=None, fields=None, primary_exchange=False)

Create a new database for collecting real-time tick data from Interactive Brokers.

The market data requirements you specify when you create a new database are applied each time you collect data for that database.

Parameters:
  • code (str, required) – the code to assign to the database (lowercase alphanumerics and hyphens only)

  • universes (list of str) – include these universes

  • sids (list of str) – include these sids

  • fields (list of str) – collect these fields (pass ‘?’ or any invalid fieldname to see available fields, default fields are ‘LastPrice’ and ‘Volume’)

  • primary_exchange (bool) – limit to data from the primary exchange (default False)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Create a database for collecting real-time trades and volume for US stocks:

>>> create_ibkr_tick_db("usa-stk-trades", universes="usa-stk",
                        fields=["LastPrice", "Volume"])

Create a database for collecting trades and quotes for a universe of futures:

>>> create_ibkr_tick_db("cme-fut-taq", universes="cme-fut",
                        fields=["LastPrice", "Volume", "BidPrice", "AskPrice", "BidSize", "AskSize"])
quantrocket.realtime.create_polygon_tick_db(code, universes=None, sids=None, fields=None)

Create a new database for collecting real-time tick data from Polygon.

The market data requirements you specify when you create a new database are applied each time you collect data for that database.

Parameters:
  • code (str, required) – the code to assign to the database (lowercase alphanumerics and hyphens only)

  • universes (list of str) – include these universes

  • sids (list of str) – include these sids

  • fields (list of str) – collect these fields (pass ‘?’ or any invalid fieldname to see available fields, default fields are ‘LastPrice’ and ‘LastSize’)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Create a database for collecting real-time trade prices and sizes for US stocks:

>>> create_polygon_tick_db("usa-stk-trades", universes="usa-stk", fields=["LastPrice", "LastSize"])
quantrocket.realtime.create_alpaca_tick_db(code, universes=None, sids=None, fields=None)

Create a new database for collecting real-time tick data from Alpaca.

The market data requirements you specify when you create a new database are applied each time you collect data for that database.

Parameters:
  • code (str, required) – the code to assign to the database (lowercase alphanumerics and hyphens only)

  • universes (list of str) – include these universes

  • sids (list of str) – include these sids

  • fields (list of str) – collect these fields (pass ‘?’ or any invalid fieldname to see available fields, default fields are ‘LastPrice’ and ‘LastSize’)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Create a database for collecting real-time trade prices and sizes for US stocks:

>>> create_alpaca_tick_db("usa-stk-trades", universes="usa-stk", fields=["LastPrice", "LastSize"])
quantrocket.realtime.create_agg_db(code, tick_db_code, bar_size, fields=None)

Create an aggregate database from a tick database.

Aggregate databases provide rolled-up views of the underlying tick data, aggregated to a desired frequency (such as 1-minute bars).

Parameters:
  • code (str, required) – the code to assign to the aggregate database (lowercase alphanumerics and hyphens only)

  • tick_db_code (str, required) – the code of the tick database to aggregate

  • bar_size (str, required) – the time frequency to aggregate to (use a Pandas timedelta string, for example 10s or 1m or 2h or 1d)

  • fields (dict of list of str, optional) – include these fields in aggregate database, aggregated in these ways. Provide a dict mapping tick db fields to lists of aggregate functions to apply to the field. Available aggregate functions are “Close”, “Open”, “High”, “Low”, “Mean”, “Sum”, and “Count”. See examples section. If not specified, defaults to including the “Close” for each tick db field.

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Create an aggregate database of 1-minute bars consisting of OHLC trades and volume, from a tick database of US stocks, resulting in fields called LastPriceOpen, LastPriceHigh, LastPriceLow, LastPriceClose, and VolumeClose:

>>> create_agg_db("usa-stk-trades-1min", tick_db_code="usa-stk-trades",
                  bar_size="1m",
                  fields={"LastPrice":["Open","High","Low","Close"],
                          "Volume": ["Close"]})

Create an aggregate database of 1-second bars containing the closing bid and ask and the mean bid size and ask size, from a tick database of futures trades and quotes, resulting in fields called BidPriceClose, AskPriceClose, BidSizeMean, and AskSizeMean:

>>> create_agg_db("cme-fut-taq-1sec", tick_db_code="cme-fut-taq",
                  bar_size="1s",
                  fields={"BidPrice":["Close"],
                          "AskPrice": ["Close"],
                          "BidSize": ["Mean"],
                          "AskSize": ["Mean"]
                          })
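Note that the same parameter takes two formats: the Python client accepts fields as a dict, while the CLI and HTTP API accept 'FIELD:FUNC1,FUNC2' strings. The hypothetical helpers below (not part of quantrocket) show the correspondence, along with the aggregate field names that result:

```python
# Hypothetical helpers (not part of quantrocket) relating the Python
# client's fields dict to the CLI/HTTP 'FIELD:FUNC1,FUNC2' format and
# to the resulting aggregate field names (e.g. LastPriceOpen).
def to_cli_fields(fields: dict) -> list:
    return [f"{field}:{','.join(funcs)}" for field, funcs in fields.items()]

def agg_field_names(fields: dict) -> list:
    return [field + func for field, funcs in fields.items() for func in funcs]

fields = {"LastPrice": ["Open", "High", "Low", "Close"], "Volume": ["Close"]}
print(to_cli_fields(fields))
# ['LastPrice:Open,High,Low,Close', 'Volume:Close']
print(agg_field_names(fields))
# ['LastPriceOpen', 'LastPriceHigh', 'LastPriceLow', 'LastPriceClose', 'VolumeClose']
```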
quantrocket.realtime.get_db_config(code)

Return the configuration for a tick database or aggregate database.

Parameters:

code (str, required) – the tick database code or aggregate database code

Returns:

config

Return type:

dict

Notes

Usage Guide:

quantrocket.realtime.drop_db(code, confirm_by_typing_db_code_again=None, cascade=False)

Delete a tick database or aggregate database.

Deleting a tick database deletes its configuration and data and any associated aggregate databases. Deleting an aggregate database does not delete the tick database from which it is derived.

Deleting databases is irreversible.

Parameters:
  • code (str, required) – the tick database code or aggregate database code

  • confirm_by_typing_db_code_again (str, required) – enter the db code again to confirm you want to drop the database, its config, and all its data

  • cascade (bool) – also delete associated aggregate databases, if any. Only applicable when deleting a tick database.

Returns:

status message

Return type:

dict

See also

drop_ticks

Delete ticks from a tick database.

Notes

Usage Guide:

quantrocket.realtime.drop_ticks(code, older_than=None)

Delete ticks from a tick database. Does not delete any aggregate database records.

Deleting ticks is a way to free up disk space by deleting ticks older than a certain threshold while maintaining the ability to continue collecting new ticks as well as use any aggregate databases derived from the ticks.

Note: ticks are stored in the database in chunks, and this function only deletes chunks in which all of the ticks are older than you specify. If some of the ticks are older but some are newer, the chunk is not deleted. This means you may still see older data returned in queries.

Parameters:
  • code (str, required) – the tick database code

  • older_than (str, required) –

    delete ticks older than this (use a Pandas timedelta string, for example

    7d)

Returns:

status message

Return type:

dict

See also

drop_db

Delete a tick database or aggregate database.

Notes

Usage Guide:

Examples

Delete ticks older than 7 days in a database called ‘usa-tech-stk-tick’ (no aggregate records are deleted):

>>> drop_ticks("usa-tech-stk-tick", older_than="7d")
quantrocket.realtime.list_databases()

List tick databases and associated aggregate databases.

Returns:

dict of {tick_db: [agg_dbs]}

Return type:

dict

Notes

Usage Guide:

quantrocket.realtime.collect_market_data(codes, sids=None, universes=None, fields=None, until=None, snapshot=False, wait=False)

Collect real-time market data and save it to a tick database.

A single snapshot of market data or a continuous stream of market data can be collected, depending on the snapshot parameter. (Snapshots are not supported for all vendors.)

Streaming real-time data is collected until cancelled, or can be scheduled for cancellation using the until parameter.

Parameters:
  • codes (list of str, required) – the tick database code(s) to collect data for

  • sids (list of str, optional) – collect market data for these sids, overriding db config (typically used to collect a subset of securities)

  • universes (list of str, optional) – collect market data for these universes, overriding db config (typically used to collect a subset of securities)

  • fields (list of str, optional) – limit to these fields, overriding db config

  • until (str, optional) – schedule data collection to end at this time. Can be a datetime (YYYY-MM-DD HH:MM:SS), a time (HH:MM:SS), or a Pandas timedelta string (e.g. 2h or 30min). If not provided, market data is collected until cancelled.

  • snapshot (bool) – collect a snapshot of market data (default is to collect a continuous stream of market data)

  • wait (bool) – wait for market data snapshot to complete before returning (default is to return immediately). Requires ‘snapshot=True’

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Collect market data for all securities in a tick database called ‘japan-banks-trades’:

>>> collect_market_data("japan-banks-trades")

Collect market data for a subset of securities in a tick database called ‘usa-stk-trades’ and automatically cancel the data collection in 30 minutes:

>>> collect_market_data("usa-stk-trades",
                        sids=["FIBBG12345", "FIBBG23456", "FIBBG34567"],
                        until="30m")

Collect a market data snapshot and wait until it completes:

>>> collect_market_data("usa-stk-trades", snapshot=True, wait=True)
quantrocket.realtime.get_active_collections(detail=False)

Return the number of tickers currently being collected, by vendor and database.

Parameters:

detail (bool) – return lists of tickers (default is to return counts of tickers)

Returns:

subscribed tickers by vendor and database

Return type:

dict

Notes

Usage Guide:

quantrocket.realtime.cancel_market_data(codes=None, sids=None, universes=None, cancel_all=False)

Cancel market data collection.

Parameters:
  • codes (list of str, optional) – the tick database code(s) to cancel collection for

  • sids (list of str, optional) – cancel market data for these sids, overriding db config

  • universes (list of str, optional) – cancel market data for these universes, overriding db config

  • cancel_all (bool) – cancel all market data collection

Returns:

subscribed tickers by vendor and database, after cancellation

Return type:

dict

Notes

Usage Guide:

Examples

Cancel market data collection for a tick database called ‘cme-fut-taq’:

>>> cancel_market_data("cme-fut-taq")

Cancel all market data collection:

>>> cancel_market_data(cancel_all=True)
quantrocket.realtime.download_market_data_file(code, filepath_or_buffer=None, output='csv', start_date=None, end_date=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None, fields=None)

Query market data from a tick database or aggregate database and download to file.

Parameters:
  • code (str, required) – the code of the tick database or aggregate database to query

  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • output (str) – output format (json, csv, default is csv)

  • start_date (str (YYYY-MM-DD HH:MM:SS), optional) – limit to market data on or after this datetime. Can pass a date (YYYY-MM-DD), datetime with optional timezone (YYYY-MM-DD HH:MM:SS TZ), or time with optional timezone. A time without date will be interpreted as referring to today if the time is earlier than now, or yesterday if the time is later than now.

  • end_date (str (YYYY-MM-DD HH:MM:SS), optional) – limit to market data on or before this datetime. Can pass a date (YYYY-MM-DD), datetime with optional timezone (YYYY-MM-DD HH:MM:SS TZ), or time with optional timezone.

  • universes (list of str, optional) – limit to these universes (default is to return all securities in database)

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

  • fields (list of str, optional) – only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Return type:

None

Notes

Usage Guide:

Examples

Download a CSV of futures market data since 08:00 AM Chicago time:

>>> download_market_data_file("cme-fut-taq",
                             start_date="08:00:00 America/Chicago",
                             filepath_or_buffer="cme_taq.csv")
>>> market_data = pd.read_csv("cme_taq.csv", parse_dates=["Date"])
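Once downloaded, tick data is ordinary CSV and can be reshaped with pandas. A sketch using synthetic data, assuming the conventional Sid, Date, and LastPrice column names shown in this guide (your database's columns may differ):

```python
# Illustrative only: reshape a downloaded tick CSV into a sid-by-time
# price matrix. Column names are assumptions based on this guide.
import io
import pandas as pd

csv_data = io.StringIO(
    "Sid,Date,LastPrice\n"
    "FI12345,2024-01-02 08:00:00,101.25\n"
    "FI12345,2024-01-02 08:00:01,101.50\n"
    "FI23456,2024-01-02 08:00:00,99.75\n"
)
ticks = pd.read_csv(csv_data, parse_dates=["Date"])
prices = ticks.pivot(index="Date", columns="Sid", values="LastPrice")
print(prices.shape)  # one row per timestamp, one column per sid
```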

See also

quantrocket.get_prices

load prices into a DataFrame

 

Real-Time Data API

Resource Group

Tick Database

Create Tick Database
PUT/realtime/databases/{code}{?universes,sids,vendor,fields,primary_exchange}

Create a new database for collecting real-time tick data.

The market data requirements you specify when you create a new database are applied each time you collect data for that database.

Example URI

PUT http://houston/realtime/databases/cme-fut-taq?universes=cme-fut&sids=FI12345&vendor=ibkr&fields=Last&primary_exchange=true
URI Parameters
code
str (required) Example: cme-fut-taq

the code to assign to the database (lowercase alphanumerics and hyphens only)

universes
str (optional) Example: cme-fut

include these universes (pass multiple times for multiple universes)

sids
str (optional) Example: FI12345

include these sids (pass multiple times for multiple sids)

vendor
str (optional) Example: ibkr

the vendor to collect data from

Choices: alpaca ibkr polygon

fields
str (optional) Example: Last

collect these fields (pass multiple times for multiple fields) (pass ‘?’ or any invalid fieldname to see available fields, default fields are ‘Last’ and ‘Volume’)

primary_exchange
bool (optional) Example: true

limit to data from the primary exchange (default False)

Response  200
Headers
Content-Type: application/json
Body
{
  "status": "status: successfully created tick database cme-fut-taq"
}

Get Tick Database Config
GET/realtime/databases/{code}

Return the configuration for a tick database.

Example URI

GET http://houston/realtime/databases/cme-fut-taq
URI Parameters
code
str (required) Example: cme-fut-taq

the tick database code

Response  200
Headers
Content-Type: application/json
Body
{
  "universes": [
    "cme-fut"
  ],
  "vendor": "ibkr",
  "fields": [
    "Last",
    "Volume",
    "Bid",
    "Ask",
    "BidSize",
    "AskSize"
  ]
}

Delete Tick Database
DELETE/realtime/databases/{code}{?confirm_by_typing_db_code_again,cascade}

Delete a tick database.

Deleting a tick database deletes its configuration and data and any associated aggregate databases.

Example URI

DELETE http://houston/realtime/databases/cme-fut-taq?confirm_by_typing_db_code_again=cme-fut-taq&cascade=true
URI Parameters
code
str (required) Example: cme-fut-taq

the tick database code

confirm_by_typing_db_code_again
str (required) Example: cme-fut-taq

enter the db code again to confirm you want to drop the database, its config, and all its data

cascade
bool (optional) Example: true

also delete associated aggregate databases, if any.

Response  200
Headers
Content-Type: application/json
Body
{
  "status": "deleted tick database cme-fut-taq"
}

Ticks

Delete Ticks
DELETE/realtime/ticks/{code}{?older_than,cascade}

Delete ticks from a tick database.

Deleting ticks is a way to free up disk space by deleting ticks older than a certain threshold while maintaining the ability to continue collecting new ticks as well as use any aggregate databases derived from the ticks.

Note: ticks are stored in the database in chunks, and this function only deletes chunks in which all of the ticks are older than you specify. If some of the ticks are older but some are newer, the chunk is not deleted. This means you may still see older data returned in queries.

Note: when using cascade=True to also delete records from aggregate databases, the only aggregate records that will be deleted are ones corresponding to the ticks being deleted at the same time. This can have unintuitive consequences if you sometimes use cascade=True and sometimes don’t. For example, if you delete ticks older than 1 hour without cascade, then repeat the function with cascade=True, no aggregate records will be deleted. This is because the tick records older than 1 hour were already deleted previously and thus there can be no cascading delete of aggregate records on the subsequent call.

Example URI

DELETE http://houston/realtime/ticks/cme-fut-taq?older_than=7d&cascade=true
URI Parameters
code
str (required) Example: cme-fut-taq

the tick database code

older_than
str (required) Example: 7d

delete ticks older than this (use a Pandas timedelta string, for example 7d)

cascade
bool (optional) Example: true

also delete records that are older than older_than from this tick database’s aggregate database(s), if any. By default, does not delete any aggregate database records.

Response  200
Headers
Content-Type: application/json
Body
{
  "status": "dropped ticks older than 7d from database cme-fut-taq"
}

Aggregate Database

Create Aggregate Database
PUT/realtime/databases/{tick_db_code}/aggregates/{code}{?bar_size,fields}

Create an aggregate database from a tick database.

Aggregate databases provide rolled-up views of the underlying tick data, aggregated to a desired frequency (such as 1-minute bars).

Example URI

PUT http://houston/realtime/databases/cme-fut-taq/aggregates/cme-fut-taq-1s?bar_size=1s&fields=Last:Close,Open
URI Parameters
code
str (required) Example: cme-fut-taq-1s

the code to assign to the aggregate database (lowercase alphanumerics and hyphens only)

tick_db_code
str (required) Example: cme-fut-taq

the code of the tick database to aggregate

bar_size
str (required) Example: 1s

the time frequency to aggregate to (use a Pandas timedelta string, for example 10s or 1m or 2h or 1d)

fields
str (optional) Example: Last:Close,Open

include these fields in aggregate database, aggregated in these ways. Specify as a list of strings mapping tick db fields to a comma-separated list of aggregate functions to apply to the field. Format strings as ‘FIELD:FUNC1,FUNC2’. Available aggregate functions are ‘Close’, ‘Open’, ‘High’, ‘Low’, ‘Mean’, ‘Sum’, and ‘Count’. If not specified, defaults to including the ‘Close’ for each tick db field (pass multiple times for multiple fields)

Response  200
Headers
Content-Type: application/json
Body
{
  "status": "successfully created aggregate database cme-fut-taq-1sec from tick database cme-fut-taq"
}
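Judging from the example responses in this section, aggregate columns appear to be named by concatenating the tick field and the aggregate function (e.g. Bid + Close → BidClose). A small illustrative helper, not part of the client library, that expands ‘FIELD:FUNC1,FUNC2’ specs into those column names:

```python
def parse_field_specs(specs):
    """Expand 'FIELD:FUNC1,FUNC2' strings into the aggregate column names
    the resulting database will expose (tick field + aggregate function)."""
    columns = []
    for spec in specs:
        field, _, funcs = spec.partition(":")
        for func in funcs.split(","):
            columns.append(field + func)
    return columns

cols = parse_field_specs(["Bid:Close", "BidSize:Mean", "Ask:Close", "AskSize:Mean"])
print(cols)  # ['BidClose', 'BidSizeMean', 'AskClose', 'AskSizeMean']
```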

Get Aggregate Database Config
GET/realtime/databases/{tick_db_code}/aggregates/{code}

Return the configuration for an aggregate database.

Example URI

GET http://houston/realtime/databases/cme-fut-taq/aggregates/cme-fut-taq-1s
URI Parameters
code
str (required) Example: cme-fut-taq-1s

the aggregate database code

tick_db_code
str (required) Example: cme-fut-taq

the tick database code

Response  200
Headers
Content-Type: application/json
Body
{
  "tick_db_code": "cme-fut-taq",
  "bar_size": "1s",
  "fields": [
    "AskClose",
    "AskSizeMean",
    "BidClose",
    "BidSizeMean"
  ]
}

Delete Aggregate Database
DELETE/realtime/databases/{tick_db_code}/aggregates/{code}{?confirm_by_typing_db_code_again}

Delete an aggregate database.

Deleting an aggregate database does not delete the tick database from which it is derived.

Example URI

DELETE http://houston/realtime/databases/cme-fut-taq/aggregates/cme-fut-taq-1s?confirm_by_typing_db_code_again=cme-fut-taq-1s
URI Parameters
code
str (required) Example: cme-fut-taq-1s

the aggregate database code

tick_db_code
str (required) Example: cme-fut-taq

the tick database code

confirm_by_typing_db_code_again
str (required) Example: cme-fut-taq-1s

enter the aggregate db code again to confirm you want to drop the database, its config, and all its data

Response  200
Headers
Content-Type: application/json
Body
{
  "status": "deleted aggregate database cme-fut-taq-1sec"
}

Databases

List Real-Time Databases
GET/realtime/databases

List tick databases and associated aggregate databases.

Example URI

GET http://houston/realtime/databases
Response  200
Headers
Content-Type: application/json
Body
{
  "cme-fut-taq": [
    "cme-fut-taq-1s"
  ],
  "usa-liquid-taq": [
    "usa-liquid-taq-1h"
  ]
}

Market Data Collection

Collect Market Data
POST/realtime/collections{?codes,sids,universes,fields,until,snapshot,wait}

Collect real-time market data and save it to a tick database.

A single snapshot of market data or a continuous stream of market data can be collected, depending on the snapshot parameter. (Snapshots are not supported for all vendors.)

Streaming real-time data is collected until cancelled, or can be scheduled for cancellation using the until parameter.

Example URI

POST http://houston/realtime/collections?codes=cme-fut-taq&sids=FI12345&universes=cme-fut&fields=Last&until=30m&snapshot=false&wait=false
URI Parameters
codes
str (required) Example: cme-fut-taq

the tick database code(s) to collect data for (pass multiple times for multiple codes)

sids
str (optional) Example: FI12345

collect market data for these sids, overriding db config (typically used to collect a subset of securities) (pass multiple times for multiple sids)

universes
str (optional) Example: cme-fut

collect market data for these universes, overriding config (typically used to collect a subset of securities) (pass multiple times for multiple universes)

fields
str (optional) Example: Last

limit to these fields, overriding db config (pass multiple times for multiple fields)

until
str (optional) Example: 30m

schedule data collection to end at this time. Can be a datetime (YYYY-MM-DD HH:MM:SS), a time (HH:MM:SS), or a Pandas timedelta string (e.g. 2h or 30min). If not provided, market data is collected until cancelled.

snapshot
bool (required) Example: false

collect a snapshot of market data (default is to collect a continuous stream of market data)

wait
bool (required) Example: false

wait for market data snapshot to complete before returning (default is to return immediately). Requires ‘snapshot=true’

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the market data will be collected asynchronously"
}
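Repeated parameters such as codes, sids, and fields are passed multiple times in the query string. One way to build such a URL with the standard library (an illustration only; the quantrocket client handles this for you):

```python
from urllib.parse import urlencode

def collect_url(codes, sids=None, fields=None, until=None, snapshot=False):
    """Build the POST URL for starting data collection; repeated
    parameters appear multiple times, as documented."""
    params = [("codes", c) for c in codes]
    params += [("sids", s) for s in (sids or [])]
    params += [("fields", f) for f in (fields or [])]
    if until:
        params.append(("until", until))
    if snapshot:
        params.append(("snapshot", "true"))
    return "http://houston/realtime/collections?" + urlencode(params)

url = collect_url(["cme-fut-taq"], sids=["FI12345"], fields=["Last"], until="30m")
print(url)
# http://houston/realtime/collections?codes=cme-fut-taq&sids=FI12345&fields=Last&until=30m
```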

Get Active Market Data Collections
GET/realtime/collections{?detail}

Return the number of tickers currently being collected, by vendor and database.

Example URI

GET http://houston/realtime/collections?detail=false
URI Parameters
detail
bool (required) Example: false

return lists of tickers (default is to return counts of tickers)

Response  200
Headers
Content-Type: application/json
Body
{
  "alpaca": {
    "sample-stk-tick": 4
  },
  "ibkr": {
    "cme-fut-taq": 17
  },
  "polygon": {
    "us-stk-tick": 53
  }
}

Cancel Market Data Collection
DELETE/realtime/collections{?codes,sids,universes,cancel_all}

Cancel market data collection.

Example URI

DELETE http://houston/realtime/collections?codes=cme-fut-taq&sids=FI12345&universes=cme-fut&cancel_all=true
URI Parameters
codes
str (required) Example: cme-fut-taq

the tick database code(s) to cancel collection for (pass multiple times for multiple codes)

sids
str (optional) Example: FI12345

cancel market data for these sids, overriding db config (pass multiple times for multiple sids)

universes
str (optional) Example: cme-fut

cancel market data for these universes, overriding config (pass multiple times for multiple universes)

cancel_all
bool (required) Example: true

cancel all market data collection

Response  200
Headers
Content-Type: application/json
Body
{}

Market Data

Query Market Data
GET/realtime/{code}.{filetype}{?start_date,end_date,universes,sids,exclude_universes,exclude_sids,fields}

Query market data from a tick database or aggregate database and download to file.

Example URI

GET http://houston/realtime/cme-fut-taq.csv?start_date=2016-06-01&end_date=2017-06-01&universes=es-fut&sids=FI12345&exclude_universes=other-universe&exclude_sids=FI23456&fields=LastPriceClose
URI Parameters
code
str (required) Example: cme-fut-taq

the code of the tick database or aggregate database to query

filetype
str (required) Example: csv

output format

Choices: csv json

start_date
str (optional) Example: 2016-06-01

limit to market data on or after this datetime. Can pass a date (YYYY-MM-DD), datetime with optional timezone (YYYY-MM-DD HH:MM:SS TZ), or time with optional timezone. A time without date will be interpreted as referring to today if the time is earlier than now, or yesterday if the time is later than now.

end_date
str (optional) Example: 2017-06-01

limit to market data on or before this datetime. Can pass a date (YYYY-MM-DD), datetime with optional timezone (YYYY-MM-DD HH:MM:SS TZ), or time with optional timezone.

universes
str (optional) Example: es-fut

limit to these universes (default is to return all securities in database) (pass multiple times for multiple universes)

sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: other-universe

exclude these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

fields
str (optional) Example: LastPriceClose

only return these fields (pass ‘?’ or any invalid fieldname to see available fields) (pass multiple times for multiple fields)
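The today-or-yesterday rule for bare times in start_date and end_date can be sketched as follows. This is an illustration of the documented behavior, not QuantRocket’s actual implementation; the tie-breaking for a time exactly equal to now is an assumption.

```python
from datetime import datetime, time, timedelta

def resolve_bare_time(t, now):
    """Interpret a time-of-day with no date: today if the time is
    earlier than now, otherwise yesterday."""
    day = now.date() if t <= now.time() else now.date() - timedelta(days=1)
    return datetime.combine(day, t)

now = datetime(2024, 3, 15, 14, 30)          # hypothetical "now"
print(resolve_bare_time(time(9, 30), now))   # 2024-03-15 09:30:00 (today)
print(resolve_bare_time(time(16, 0), now))   # 2024-03-14 16:00:00 (yesterday)
```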

Response  200
Headers
Content-Type: text/csv
Body
Sid,Date,Last
FI756733,2019-05-28 19:23:16.850314+00,281.34
FI756733,2019-05-28 19:23:17.188008+00,281.38
FI756733,2019-05-28 19:23:17.939423+00,281.37
FI756733,2019-05-28 19:23:19.195198+00,281.39
FI756733,2019-05-28 19:23:20.695623+00,281.41
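The returned CSV can be parsed with any CSV reader. For example, with the standard library, using the first rows of the sample response above:

```python
import csv
import io

SAMPLE = """\
Sid,Date,Last
FI756733,2019-05-28 19:23:16.850314+00,281.34
FI756733,2019-05-28 19:23:17.188008+00,281.38
"""

rows = list(csv.DictReader(io.StringIO(SAMPLE)))
prices = [float(r["Last"]) for r in rows]
print(len(rows), prices[0])  # 2 281.34
```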

Streaming Market Data

Stream Market Data
GET/realtime/stream{?sids,exclude_sids,fields}

Stream real-time data over WebSockets.

Example URI

GET http://houston/realtime/stream?sids=FI12345&exclude_sids=FI23456&fields=BidSize
URI Parameters
sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

fields
str (optional) Example: BidSize

limit to these fields (pass multiple times for multiple fields)
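Connecting to this endpoint requires a WebSocket client library, which is outside the scope of this sketch; the query string itself is built like any other. The ws:// scheme below is an assumption:

```python
from urllib.parse import urlencode

# repeated parameters are passed multiple times, as documented
params = [("sids", "FI12345"), ("exclude_sids", "FI23456"), ("fields", "BidSize")]
url = "ws://houston/realtime/stream?" + urlencode(params)
print(url)  # ws://houston/realtime/stream?sids=FI12345&exclude_sids=FI23456&fields=BidSize
```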

quantrocket.satellite

satellite service

QuantRocket Satellite CLI

usage: quantrocket satellite [-h] {exec} ...

subcommands

subcommand

Possible choices: exec

Sub-commands

exec

execute a Python function or arbitrary shell command on a satellite service

quantrocket satellite exec [-h] [-r FILEPATH] [-o FILEPATH]
                           [-p [PARAM:VALUE ...]] [-s SERVICE_NAME]
                           CMD

Positional Arguments

CMD

the shell command to run, or the Python function in dot notation (must start with ‘codeload.’ to be interpreted as a Python function)

Named Arguments

-r, --return-file

the path of a file to be returned after the command completes

-o, --outfile

the location to write the return_file (omit to write to stdout)

-p, --params

one or more params to pass to the Python function (pass as {param:value})

-s, --service

the service name (default ‘satellite’)

Default: “satellite”

Execute a Python function or arbitrary shell command on a satellite service.

Notes

Usage Guide:

Examples

Run a Python function called ‘create_calendar_spread’ defined in ‘/codeload/scripts/combos.py’ and pass it arguments:

quantrocket satellite exec 'codeload.scripts.combos.create_calendar_spread' --params 'universe:cl-fut' 'contract_months:[1,2]'

Run a backtrader backtest and save the performance chart to file:

quantrocket satellite exec 'python /codeload/backtrader/dual_moving_average.py' --return-file '/tmp/backtrader-plot.pdf' --outfile 'backtrader-plot.pdf'
quantrocket.satellite.execute_command(cmd, return_file=None, filepath_or_buffer=None, params=None, service='satellite')

Execute a Python function or arbitrary shell command on a satellite service.

Parameters:
  • cmd (str, required) – the shell command to run, or the Python function in dot notation (must start with “codeload.” to be interpreted as a Python function).

  • return_file (str, optional) – the path of a file to be returned after the command completes

  • filepath_or_buffer (str, optional) – the location to write the return_file (omit to write to stdout)

  • params (dict of PARAM:VALUE, optional) – one or more params to pass to the Python function (pass as {param:value})

  • service (str, optional) – the service name (default ‘satellite’)

Returns:

None if a return_file was requested, otherwise a status message. If cmd uses Python dot notation and the Python function returns a value, it will be included in the status message under the “output” key. Return values must be JSON-serializable.

Return type:

dict or None

Notes

Usage Guide:

Examples

Run a Python function called ‘create_calendar_spread’ defined in ‘/codeload/scripts/combos.py’ and pass it arguments:

>>> execute_command("codeload.scripts.combos.create_calendar_spread",
                    params={"universe":"cl-fut", "contract_months":[1,2]})

Run a Python function called ‘calculate_signal’ defined in ‘/codeload/scripts/custom.py’ and retrieve the return value:

>>> response = execute_command("codeload.scripts.custom.calculate_signal")
>>> if response["status"] == "success":
        print(response["output"])

Run a backtrader backtest and save the performance chart to file:

>>> execute_command("python /codeload/backtrader/dual_moving_average.py",
                    return_file="/tmp/backtrader-plot.pdf"
                    outfile="backtrader-plot.pdf")
 

Satellite API

Resource Group

Commands

Execute Command
POST/{service}/commands{?cmd,return_file,params}

Execute a Python function or arbitrary shell command on a satellite service.

Example URI

POST http://houston/satellite/commands?cmd=python%20%2Fcodeload%2Fbacktrader%2Fdual_moving_average.py&return_file=%2Ftmp%2Fbacktrader-plot.pdf&params=myarg:myval
URI Parameters
service
str (required) Example: satellite

the service name

cmd
str (required) Example: python%20%2Fcodeload%2Fbacktrader%2Fdual_moving_average.py

the shell command to run, or the Python function in dot notation (must start with “codeload.” to be interpreted as a Python function).

return_file
str (optional) Example: %2Ftmp%2Fbacktrader-plot.pdf

the path of a file to be returned after the command completes

params
str (optional) Example: myarg:myval

one or more params to pass to the Python function (pass as param:value)

Response  200
Headers
Content-Type: application/octet-stream
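Shell commands and file paths must be percent-encoded when passed as query parameters, as in the example URI above. With the standard library:

```python
from urllib.parse import quote

cmd = "python /codeload/backtrader/dual_moving_average.py"
return_file = "/tmp/backtrader-plot.pdf"

# safe="" also encodes "/" so paths survive inside a query parameter
encoded = quote(cmd, safe="")
url = (f"http://houston/satellite/commands?cmd={encoded}"
       f"&return_file={quote(return_file, safe='')}")
print(encoded)  # python%20%2Fcodeload%2Fbacktrader%2Fdual_moving_average.py
```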

quantrocket.utils

Utility functions

quantrocket.version

version number

show the QuantRocket version number

usage: quantrocket version [-h] [-d]

Named Arguments

-d, --detail

show the services version number and also the version number of the client library making this API call. Default is to only show the services version number, which is the main QuantRocket version number.

Default: False

Show the QuantRocket version number.

Examples:

Show the version number:

quantrocket version

Show both the services and client version numbers:

quantrocket version -d

quantrocket.version.get_version(detail=False)

Show the QuantRocket version number.

Parameters:

detail (bool) – if True, show the services version number and also the version number of the client library making this API call. Default is to only show the services version number, which is the main QuantRocket version number.

Returns:

services version number, or dict of services and client version numbers

Return type:

str or dict

 

Version API

Resource Group

Version

Version
GET/version

Show the QuantRocket version number.

Example URI

GET http://houston/version
Response  200
Headers
Content-Type: application/json
Body
{
  "services": "2.8.0"
}

quantrocket.zipline

This API is for backtesting and live trading of Zipline strategies, as well as managing data bundles. For writing Zipline strategies, see the zipline API.

QuantRocket CLI for Zipline

usage: quantrocket zipline [-h]
                           {create-usstock-bundle,create-sharadar-bundle,create-bundle-from-db,ingest,list-bundles,config,drop-bundle,default-bundle,get,backtest,paramscan,tearsheet,trade,active,cancel}
                           ...

subcommands

subcommand

Possible choices: create-usstock-bundle, create-sharadar-bundle, create-bundle-from-db, ingest, list-bundles, config, drop-bundle, default-bundle, get, backtest, paramscan, tearsheet, trade, active, cancel

Sub-commands

create-usstock-bundle

create a Zipline bundle for US stocks

quantrocket zipline create-usstock-bundle [-h] [-i SID] [-u UNIVERSE] [--free]
                                          [-d {daily,d,minute,m}]
                                          CODE

Positional Arguments

CODE

the code to assign to the bundle (lowercase alphanumerics and hyphens only)

Named Arguments

-i, --sids

limit to these sids (only supported for minute data bundles)

-u, --universes

limit to these universes (only supported for minute data bundles)

--free

limit to free sample data

Default: False

-d, --data-frequency

Possible choices: daily, d, minute, m

whether to collect minute data (which also includes daily data) or only daily data. Default is minute data. Possible choices: [‘daily’, ‘d’, ‘minute’, ‘m’]

Create a Zipline bundle for US stocks.

This command defines the bundle parameters but does not ingest the actual data. To ingest the data, see quantrocket zipline ingest.

Notes

Usage Guide:

Examples

Create a minute data bundle for all US stocks:

quantrocket zipline create-usstock-bundle usstock-1min

Create a bundle for daily data only:

quantrocket zipline create-usstock-bundle usstock-1d --data-frequency daily

Create a minute data bundle based on a universe:

quantrocket zipline create-usstock-bundle usstock-tech-1min --universes us-tech

Create a minute data bundle of free sample data:

quantrocket zipline create-usstock-bundle usstock-free-1min --free

create-sharadar-bundle

create a Zipline bundle of daily data for Sharadar stocks and/or ETFs

quantrocket zipline create-sharadar-bundle [-h] [-t [SEC_TYPE ...]] [--free]
                                           CODE

Positional Arguments

CODE

the code to assign to the bundle (lowercase alphanumerics and hyphens only)

Named Arguments

-t, --sec-types

Possible choices: STK, ETF

limit to these security types. Possible choices: [‘STK’, ‘ETF’]. Default is to include both stocks and ETFs.

--free

limit to free sample data

Default: False

Create a Zipline bundle of daily data for Sharadar stocks and/or ETFs.

This command defines the bundle parameters but does not ingest the actual data. To ingest the data, see quantrocket zipline ingest.

Notes

Usage Guide:

Examples

Create a bundle for all Sharadar stocks and ETFs:

quantrocket zipline create-sharadar-bundle sharadar-1d

Create a bundle for ETFs only:

quantrocket zipline create-sharadar-bundle sharadar-etf-1d --sec-types ETF

Create a bundle of free sample data:

quantrocket zipline create-sharadar-bundle sharadar-free-1d --free

create-bundle-from-db

create a Zipline bundle from a history database or real-time aggregate database

quantrocket zipline create-bundle-from-db [-h] -d CODE [CODE ...] [-c NAME]
                                          [-f [ZIPLINE_FIELD:DB_FIELD ...]] -s
                                          YYYY-MM-DD [-e YYYY-MM-DD]
                                          [-u [UNIVERSE ...]] [-i [SID ...]]
                                          [--exclude-universes [UNIVERSE ...]]
                                          [--exclude-sids [SID ...]]
                                          CODE

Positional Arguments

CODE

the code to assign to the bundle (lowercase alphanumerics and hyphens only)

Named Arguments

-d, --from-db

the code(s) of one or more history databases or real-time aggregate databases to ingest. If multiple databases are specified, they must have the same bar size and same fields. If a security is present in multiple databases, the first database’s values will be used.

-c, --calendar

the name of the calendar to use with this bundle (provide ‘?’ or any invalid calendar name to see available choices)

-f, --fields

mapping of Zipline fields (open, high, low, close, volume) to db fields. Pass as ‘zipline_field:db_field’. Defaults to mapping Zipline ‘open’ to db ‘Open’, etc.

filtering options for db ingestion

-s, --start-date

limit to historical data on or after this date. This parameter is required and also determines the default start date for backtests and queries.

-e, --end-date

limit to historical data on or before this date

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

Create a Zipline bundle from a history database or real-time aggregate database.

You can ingest 1-minute or 1-day databases.

This command defines the bundle parameters but does not ingest the actual data. To ingest the data, see quantrocket zipline ingest.

Notes

Usage Guide:

Examples

Create a bundle from a history database called “es-fut-1min” and name it like the history database:

quantrocket zipline create-bundle-from-db es-fut-1min --from-db es-fut-1min --calendar us_futures --start-date 2015-01-01

Create a bundle named “usa-stk-1min-2017” for ingesting a single year of US 1-minute stock data from a history database called “usa-stk-1min”:

quantrocket zipline create-bundle-from-db usa-stk-1min-2017 --from-db usa-stk-1min -s 2017-01-01 -e 2017-12-31 --calendar XNYS

Create a bundle from a real-time aggregate database and specify how to map Zipline fields to the database fields:

quantrocket zipline create-bundle-from-db free-stk-1min --from-db free-stk-tick-1min --calendar XNYS --start-date 2020-06-01 --fields close:LastPriceClose open:LastPriceOpen high:LastPriceHigh low:LastPriceLow volume:VolumeClose
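The ‘zipline_field:db_field’ pairs can be thought of as overrides on the default OHLCV mapping (Zipline ‘open’ to db ‘Open’, etc.). An illustrative parser, not part of the client library, showing the mapping produced by the example above:

```python
def parse_field_mappings(pairs):
    """Convert 'zipline_field:db_field' strings into a dict, starting
    from the documented defaults for any field not explicitly mapped."""
    mapping = {f: f.capitalize() for f in ("open", "high", "low", "close", "volume")}
    for pair in pairs:
        zipline_field, _, db_field = pair.partition(":")
        mapping[zipline_field] = db_field
    return mapping

m = parse_field_mappings(["close:LastPriceClose", "volume:VolumeClose"])
print(m["close"], m["open"])  # LastPriceClose Open
```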

ingest

ingest data into a previously defined bundle

quantrocket zipline ingest [-h] [-i [SID ...]] [-u [UNIVERSE ...]] CODE

Positional Arguments

CODE

the bundle code

Named Arguments

-i, --sids

limit to these sids, overriding stored config

-u, --universes

limit to these universes, overriding stored config

Ingest data into a previously defined bundle.

Notes

Usage Guide:

Examples

Ingest data into a bundle called usstock-1min:

quantrocket zipline ingest usstock-1min

list-bundles

list available data bundles and whether data has been ingested into them

quantrocket zipline list-bundles [-h]

List available data bundles and whether data has been ingested into them.

Notes

Usage Guide:

Examples

quantrocket zipline list-bundles

config

return the configuration of a bundle

quantrocket zipline config [-h] CODE

Positional Arguments

CODE

the bundle code

Return the configuration of a bundle.

Notes

Usage Guide:

Examples

Return the configuration of a bundle called ‘usstock-1min’:

quantrocket zipline config usstock-1min

drop-bundle

delete a bundle

quantrocket zipline drop-bundle [-h] --confirm-by-typing-bundle-code-again
                                CODE
                                CODE

Positional Arguments

CODE

the bundle code

Named Arguments

--confirm-by-typing-bundle-code-again

enter the bundle code again to confirm you want to drop the bundle, its config, and all its data

Delete a bundle.

Notes

Usage Guide:

Examples

Delete a bundle called ‘es-fut-1min’:

quantrocket zipline drop-bundle es-fut-1min --confirm-by-typing-bundle-code-again es-fut-1min

default-bundle

set or show the default bundle to use for backtesting and trading

quantrocket zipline default-bundle [-h] [bundle]

Positional Arguments

bundle

the bundle code

Set or show the default bundle to use for backtesting and trading.

Setting a default bundle is a convenience and is optional. It can be overridden by manually specifying a bundle when backtesting or trading.

Notes

Usage Guide:

Examples

Set a bundle named usstock-1min as the default:

quantrocket zipline default-bundle usstock-1min

Show current default bundle:

quantrocket zipline default-bundle

get

query minute or daily data from a Zipline bundle and download to a CSV file

quantrocket zipline get [-h] [-s YYYY-MM-DD] [-e YYYY-MM-DD]
                        [-d {daily,d,minute,m}] [-u [UNIVERSE ...]]
                        [-i [SID ...]] [--exclude-universes [UNIVERSE ...]]
                        [--exclude-sids [SID ...]] [-t [HH:MM:SS ...]]
                        [-o OUTFILE] [-f [FIELD ...]]
                        CODE

Positional Arguments

CODE

the bundle code

filtering options

-s, --start-date

limit to history on or after this date

-e, --end-date

limit to history on or before this date

-d, --data-frequency

Possible choices: daily, d, minute, m

whether to query minute or daily data. If omitted, defaults to minute data for minute bundles and to daily data for daily bundles. This parameter only needs to be set to request daily data from a minute bundle. Possible choices: [‘daily’, ‘d’, ‘minute’, ‘m’]

-u, --universes

limit to these universes

-i, --sids

limit to these sids

--exclude-universes

exclude these universes

--exclude-sids

exclude these sids

-t, --times

limit to these times

output options

-o, --outfile

filename to write the data to (default is stdout)

-f, --fields

only return these fields (pass ‘?’ or any invalid fieldname to see available fields)

Query minute or daily data from a Zipline bundle and download to a CSV file.

Notes

Usage Guide:

Examples

Download a CSV of minute prices since 2015 for a single security from a bundle called “usstock-1min”:

quantrocket zipline get usstock-1min --start-date 2015-01-01 -i FIBBG12345 -o minute_prices.csv

backtest

backtest a Zipline strategy and write the test results to a CSV file

quantrocket zipline backtest [-h] [-f {daily,d,minute,m}]
                             [--capital-base FLOAT] [-b CODE] [-s YYYY-MM-DD]
                             [-e YYYY-MM-DD] [-p FREQ]
                             [--params [PARAM:VALUE ...]] [-o FILENAME]
                             CODE

Positional Arguments

CODE

the strategy to run (strategy filename without extension)

Named Arguments

-f, --data-frequency

Possible choices: daily, d, minute, m

the data frequency to use. Possible choices: [‘daily’, ‘d’, ‘minute’, ‘m’] (default is minute)

--capital-base

the starting capital for the simulation (default is 1e6 (1 million))

-b, --bundle

the data bundle to use for the simulation. If omitted, the default bundle (if set) is used.

-s, --start-date

the start date of the simulation (defaults to the bundle start date)

-e, --end-date

the end date of the simulation (defaults to today)

-p, --progress

log backtest progress at this interval (use a pandas offset alias, for example ‘D’ for daily, ‘W’ for weekly, ‘M’ for monthly, ‘A’ for annually)

--params

one or more strategy parameters (defined as module-level attributes in the algo file) to modify on the fly before backtesting (pass as ‘param:value’)

-o, --output

the location to write the output file (omit to write to stdout)

Backtest a Zipline strategy and write the test results to a CSV file.

The CSV result file contains several DataFrames stacked into one: the Zipline performance results, plus the extracted returns, transactions, positions, and benchmark returns from those results.

Notes

Usage Guide:

Examples

Run a backtest from a strategy file called etf-arb.py and save a CSV file of results, logging backtest progress at annual intervals:

quantrocket zipline backtest etf-arb --bundle arca-etf-eod -s 2010-04-01 -e 2016-02-01 -o results.csv --progress A

paramscan

run a parameter scan for a Zipline strategy

quantrocket zipline paramscan [-h] [-f {daily,d,minute,m}]
                              [--capital-base FLOAT] [-b CODE] [-s YYYY-MM-DD]
                              [-e YYYY-MM-DD] -p PARAM -v VALUE [VALUE ...]
                              [--param2 PARAM] [--vals2 [VALUE ...]]
                              [--params [PARAM:VALUE ...]] [-n INT]
                              [--progress FREQ] [-o FILENAME]
                              CODE

Positional Arguments

CODE

the strategy to run (strategy filename without extension)

Named Arguments

-f, --data-frequency

Possible choices: daily, d, minute, m

the data frequency to use. Possible choices: [‘daily’, ‘d’, ‘minute’, ‘m’] (default is minute)

--capital-base

the starting capital for the simulation (default is 1e6 (1 million))

-b, --bundle

the data bundle to use for the simulation. If omitted, the default bundle (if set) is used.

-s, --start-date

the start date of the simulation (defaults to the bundle start date)

-e, --end-date

the end date of the simulation (defaults to today)

-p, --param1

the name of the parameter to test (a module-level attribute in the algo file)

-v, --vals1

parameter values to test (values can be integers, floats, strings, ‘True’, ‘False’, ‘None’, or ‘default’ (to test current param value); for lists/tuples, use comma-separated values)

--param2

name of a second parameter to test (for 2-D parameter scans)

--vals2

values to test for parameter 2 (values can be integers, floats, strings, ‘True’, ‘False’, ‘None’, or ‘default’ (to test current param value); for lists/tuples, use comma-separated values)

--params

one or more strategy parameters (defined as module-level attributes in the algo file) to modify on the fly before running the parameter scan (pass as ‘param:value’)

-n, --num-workers

the number of parallel workers to run. Running in parallel can speed up the parameter scan if your system has adequate resources. Default is 1, meaning no parallel processing.

--progress

log backtest progress at this interval (use a pandas offset alias, for example ‘D’ for daily, ‘W’ for weekly, ‘M’ for monthly, ‘A’ for annually). This parameter controls logging in the underlying backtests; a summary of scan results will be logged regardless of this parameter. Using this parameter when --num-workers is greater than 1 will result in messy and interleaved log output and is not recommended.

-o, --output

the location to write the output file (omit to write to stdout)

Run a parameter scan for a Zipline strategy. The resulting CSV can be plotted with moonchart.ParamscanTearsheet.

Notes

Usage Guide:

Examples

Run a parameter scan for a moving average strategy called dma:

quantrocket zipline paramscan dma -b usstock-1min -f daily -s 2015-01-03 -e 2022-06-30 -p MAVG_WINDOW -v 20 50 100 -o dma_MAVG_WINDOW.csv
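How value tokens like 20, True, or 1,2 might be interpreted can be sketched as follows. This is an illustration of the documented conventions; the actual parsing is internal to QuantRocket.

```python
def parse_val(v):
    """Interpret a --vals1/--vals2 token: 'True'/'False'/'None' become
    Python values, comma-separated tokens become a list, numeric strings
    become ints or floats, and anything else (including the special
    'default') stays a string."""
    if "," in v:
        return [parse_val(item) for item in v.split(",")]
    literals = {"True": True, "False": False, "None": None}
    if v in literals:
        return literals[v]
    try:
        return int(v)
    except ValueError:
        pass
    try:
        return float(v)
    except ValueError:
        return v

print([parse_val(v) for v in ("20", "50", "100")])  # [20, 50, 100]
print(parse_val("1,2"))                             # [1, 2]
```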

tearsheet

create a pyfolio tear sheet from a Zipline backtest result

quantrocket zipline tearsheet [-h] -o FILENAME [FILENAME]

Positional Arguments

FILENAME

the CSV file from a Zipline backtest (omit to read file from stdin)

Default: “-”

Named Arguments

-o, --output

the location to write the pyfolio tear sheet

Create a pyfolio PDF tear sheet from a Zipline backtest result.

Examples

Create a pyfolio tear sheet from a Zipline CSV results file:

quantrocket zipline tearsheet results.csv -o results.pdf

Run a Zipline backtest and create a pyfolio tear sheet without saving the CSV file:

quantrocket zipline backtest dma -s 2010-04-01 -e 2016-02-01 | quantrocket zipline tearsheet -o dma.pdf

trade

trade a Zipline strategy

quantrocket zipline trade [-h] [-b CODE] [-a ACCOUNT] [-f {daily,d,minute,m}]
                          [--dry-run]
                          CODE

Positional Arguments

CODE

the strategy to run (strategy filename without extension)

Named Arguments

-b, --bundle

the data bundle to use. If omitted, the default bundle (if set) is used.

-a, --account

the account to run the strategy in. Only required if the strategy is allocated to more than one account in quantrocket.zipline.allocations.yml

-f, --data-frequency

Possible choices: daily, d, minute, m

the data frequency to use. Possible choices: [‘daily’, ‘d’, ‘minute’, ‘m’] (default is minute)

--dry-run

write orders to file instead of sending them to the blotter. Orders will be written to /codeload/zipline/{strategy}.{account}.orders.{date}.csv. If omitted, orders are sent to the blotter and not written to file.

Default: False

Trade a Zipline strategy.

Notes

Usage Guide:

Examples

Trade a strategy defined in momentum-pipeline.py:

quantrocket zipline trade momentum-pipeline --bundle my-bundle

active

list actively trading Zipline strategies

quantrocket zipline active [-h]

List actively trading Zipline strategies.

Notes

Usage Guide:

Examples

List strategies:

quantrocket zipline active

cancel

cancel actively trading strategies

quantrocket zipline cancel [-h] [-s [CODE ...]] [-a [ACCOUNT ...]] [--all]

Named Arguments

-s, --strategies

limit to these strategies

-a, --accounts

limit to these accounts

--all

cancel all actively trading strategies

Default: False

Cancel actively trading strategies.

Notes

Usage Guide:

Examples

Cancel a single strategy:

quantrocket zipline cancel --strategies momentum-pipeline

Cancel all strategies:

quantrocket zipline cancel --all
quantrocket.zipline.create_usstock_bundle(code, sids=None, universes=None, free=False, data_frequency=None)

Create a Zipline bundle for US stocks.

This function defines the bundle parameters but does not ingest the actual data. To ingest the data, see ingest_bundle.

Parameters:
  • code (str, required) – the code to assign to the bundle (lowercase alphanumerics and hyphens only)

  • sids (list of str, optional) – limit to these sids (only supported for minute data bundles)

  • universes (list of str, optional) – limit to these universes (only supported for minute data bundles)

  • free (bool) – limit to free sample data

  • data_frequency (str, optional) – whether to collect minute data (which also includes daily data) or only daily data. Default is minute data. Possible choices: daily, minute (or aliases d, m)

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Create a minute data bundle for all US stocks:

>>> create_usstock_bundle("usstock-1min")

Create a bundle for daily data only:

>>> create_usstock_bundle("usstock-1d", data_frequency="daily")

Create a minute data bundle based on a universe:

>>> create_usstock_bundle("usstock-tech-1min", universes="us-tech")

Create a minute data bundle of free sample data:

>>> create_usstock_bundle("usstock-free-1min", free=True)
quantrocket.zipline.create_sharadar_bundle(code, sec_types=None, free=False)

Create a Zipline bundle of daily data for Sharadar stocks and/or ETFs.

This function defines the bundle parameters but does not ingest the actual data. To ingest the data, see ingest_bundle.

Parameters:
  • code (str, required) – the code to assign to the bundle (lowercase alphanumerics and hyphens only)

  • sec_types (list of str, optional) – limit to these security types. Possible choices: STK, ETF. Default is to include both stocks and ETFs.

  • free (bool) – limit to free sample data

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Create a bundle for all Sharadar stocks and ETFs:

>>> create_sharadar_bundle("sharadar-1d")

Create a bundle for ETFs only:

>>> create_sharadar_bundle("sharadar-etf-1d", sec_types="ETF")

Create a bundle of free sample data:

>>> create_sharadar_bundle("sharadar-free-1d", free=True)
quantrocket.zipline.create_bundle_from_db(code, from_db, calendar, start_date=None, end_date=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None, fields=None)

Create a Zipline bundle from a history database or real-time aggregate database.

You can ingest 1-minute or 1-day databases.

This function defines the bundle parameters but does not ingest the actual data. To ingest the data, see ingest_bundle.

Parameters:
  • code (str, required) – the code to assign to the bundle (lowercase alphanumerics and hyphens only)

  • from_db (str or list of str, required) – the code(s) of one or more history databases or real-time aggregate databases to ingest. If multiple databases are specified, they must have the same bar size and the same fields. If a security is present in multiple databases, the first database’s values will be used.

  • calendar (str, required) – the name of the calendar to use with this bundle (provide ‘?’ or any invalid calendar name to see available choices)

  • start_date (str (YYYY-MM-DD), required) – limit to historical data on or after this date. This parameter is required and also determines the default start date for backtests and queries.

  • end_date (str (YYYY-MM-DD), optional) – limit to historical data on or before this date

  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

  • fields (dict, optional) – mapping of Zipline fields (open, high, low, close, volume) to db fields. Defaults to mapping Zipline ‘open’ to db ‘Open’, etc.

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Create a bundle from a history database called “es-fut-1min” and name it like the history database:

>>> create_bundle_from_db("es-fut-1min", from_db="es-fut-1min", calendar="us_futures")

Create a bundle named “usa-stk-1min-2017” for ingesting a single year of US 1-minute stock data from a history database called “usa-stk-1min”:

>>> create_bundle_from_db("usa-stk-1min-2017", from_db="usa-stk-1min",
                          calendar="XNYS",
                          start_date="2017-01-01", end_date="2017-12-31")

Create a bundle from a real-time aggregate database and specify how to map Zipline fields to the database fields:

>>> create_bundle_from_db("free-stk-1min", from_db="free-stk-tick-1min",
                          calendar="XNYS", start_date="2020-06-01",
                          fields={
                              "close": "LastPriceClose",
                              "open": "LastPriceOpen",
                              "high": "LastPriceHigh",
                              "low": "LastPriceLow",
                              "volume": "VolumeClose"})
quantrocket.zipline.ingest_bundle(code, sids=None, universes=None)

Ingest data into a previously defined bundle.

Parameters:
  • code (str, required) – the bundle code

  • sids (list of str, optional) – limit to these sids, overriding stored config

  • universes (list of str, optional) – limit to these universes, overriding stored config

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Ingest data into a bundle called usstock-1min:

>>> ingest_bundle("usstock-1min")
quantrocket.zipline.list_bundles()

List available data bundles and whether data has been ingested into them.

Returns:

data bundles and whether they have data (True indicates data, False indicates config only)

Return type:

dict

Notes

Usage Guide:
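
Examples

The True/False values in the returned dict distinguish ingested bundles from config-only bundles, so the result can be filtered directly. A minimal sketch using a hypothetical return value (an actual call to `list_bundles` requires a running QuantRocket deployment):

```python
# Hypothetical return value of list_bundles(): maps bundle code -> whether
# data has been ingested (True) or only the config exists (False)
bundles = {"usstock-1min": True, "es-fut-1min": False}

# Keep only the bundles that actually contain ingested data
ingested = sorted(code for code, has_data in bundles.items() if has_data)
print(ingested)  # ['usstock-1min']
```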

quantrocket.zipline.get_bundle_config(code)

Return the configuration of a bundle.

Parameters:

code (str, required) – the bundle code

Returns:

config

Return type:

dict

Notes

Usage Guide:

quantrocket.zipline.drop_bundle(code, confirm_by_typing_bundle_code_again=None)

Delete a bundle.

Parameters:
  • code (str, required) – the bundle code

  • confirm_by_typing_bundle_code_again (str, required) – enter the bundle code again to confirm you want to drop the bundle, its config, and all its data

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Delete a bundle called ‘es-fut-1min’:

>>> drop_bundle("es-fut-1min")
quantrocket.zipline.get_default_bundle()

Return the current default bundle, if any.

Returns:

default bundle

Return type:

dict

Notes

Usage Guide:

quantrocket.zipline.set_default_bundle(bundle)

Set the default bundle to use for backtesting and trading.

Setting a default bundle is a convenience and is optional. It can be overridden by manually specifying a bundle when backtesting or trading.

Parameters:

bundle (str, required) – the bundle code

Returns:

status message

Return type:

dict

Notes

Usage Guide:

quantrocket.zipline.download_bundle_file(code, filepath_or_buffer=None, start_date=None, end_date=None, data_frequency=None, universes=None, sids=None, exclude_universes=None, exclude_sids=None, times=None, fields=None)

Query minute or daily data from a Zipline bundle and download to a CSV file.

Parameters:
  • code (str, required) – the bundle code

  • filepath_or_buffer (str or file-like object) – filepath to write the data to, or file-like object (defaults to stdout)

  • start_date (str (YYYY-MM-DD), optional) – limit to history on or after this date

  • end_date (str (YYYY-MM-DD), optional) – limit to history on or before this date

  • data_frequency (str, optional) – whether to query minute or daily data. If omitted, defaults to minute data for minute bundles and to daily data for daily bundles. This parameter only needs to be set to request daily data from a minute bundle. Possible choices: daily, minute (or aliases d, m).

  • universes (list of str, optional) – limit to these universes

  • sids (list of str, optional) – limit to these sids

  • exclude_universes (list of str, optional) – exclude these universes

  • exclude_sids (list of str, optional) – exclude these sids

  • times (list of str (HH:MM:SS), optional) – limit to these times

  • fields (list of str, optional) – only return these fields (pass [‘?’] or any invalid fieldname to see available fields)

Return type:

None

See also

quantrocket.get_prices

load prices into a DataFrame

Notes

Usage Guide:

Examples

Load minute prices into pandas:

>>> f = io.StringIO()
>>> download_bundle_file("usstock-1min", sids=["FIBBG12345"], filepath_or_buffer=f)
>>> prices = pd.read_csv(f, parse_dates=["Date"], index_col=["Field","Date"])

Isolate fields with .loc:

>>> closes = prices.loc["Close"]
quantrocket.zipline.backtest(strategy, data_frequency=None, capital_base=None, bundle=None, start_date=None, end_date=None, progress=None, params=None, filepath_or_buffer=None)

Backtest a Zipline strategy and write the test results to a CSV file.

The CSV result file contains several DataFrames stacked into one: the Zipline performance results, plus the extracted returns, transactions, positions, and benchmark returns from those results.

Parameters:
  • strategy (str, required) – the strategy to run (strategy filename without extension)

  • data_frequency (str, optional) – the data frequency of the simulation. Possible choices: daily, minute (or aliases d, m). Default is minute.

  • capital_base (float, optional) – the starting capital for the simulation (default is 1e6 (1 million))

  • bundle (str, optional) – the data bundle to use for the simulation. If omitted, the default bundle (if set) is used.

  • start_date (str (YYYY-MM-DD), optional) – the start date of the simulation (defaults to the bundle start date)

  • end_date (str (YYYY-MM-DD), optional) – the end date of the simulation (defaults to today)

  • progress (str, optional) – log backtest progress at this interval (use a pandas offset alias, for example “D” for daily, “W” for weekly, “M” for monthly, “A” for annually)

  • params (dict of PARAM:VALUE, optional) – one or more strategy parameters (defined as module-level attributes in the algo file) to modify on the fly before backtesting (pass as {param:value}).

  • filepath_or_buffer (str or file-like object, optional) – the location to write the output file (omit to write to stdout)

Return type:

None

Notes

Usage Guide:

Examples

Run a backtest defined in momentum-pipeline.py and save to CSV, logging backtest progress at weekly intervals.

>>> backtest("momentum-pipeline", bundle="my-bundle",
             start_date="2015-02-04", end_date="2015-12-31",
             progress="W",
             filepath_or_buffer="momentum_pipeline_results.csv")

Get a pyfolio tear sheet from the results:

>>> import pyfolio as pf
>>> pf.from_zipline_csv("momentum_pipeline_results.csv")
quantrocket.zipline.scan_parameters(strategy, data_frequency=None, capital_base=None, bundle=None, start_date=None, end_date=None, param1=None, vals1=None, param2=None, vals2=None, progress=None, params=None, num_workers=None, filepath_or_buffer=None)

Run a parameter scan for a Zipline strategy.

Returns a CSV of scan results which can be plotted with moonchart.ParamscanTearsheet.

Parameters:
  • strategy (str, required) – the strategy to run (strategy filename without extension)

  • data_frequency (str, optional) – the data frequency of the simulation. Possible choices: daily, minute (or aliases d, m). Default is minute.

  • capital_base (float, optional) – the starting capital for the simulation (default is 1e6 (1 million))

  • bundle (str, optional) – the data bundle to use for the simulation. If omitted, the default bundle (if set) is used.

  • start_date (str (YYYY-MM-DD), optional) – the start date of the simulation (defaults to the bundle start date)

  • end_date (str (YYYY-MM-DD), optional) – the end date of the simulation (defaults to today)

  • param1 (str, required) – the name of the parameter to test (a module-level attribute in the algo file)

  • vals1 (list of int/float/str/tuple, required) – parameter values to test (values can be ints, floats, strings, False, True, None, ‘default’ (to test current param value), or lists of ints/floats/strings)

  • param2 (str, optional) – name of a second parameter to test (for 2-D parameter scans)

  • vals2 (list of int/float/str/tuple, optional) – values to test for parameter 2 (values can be ints, floats, strings, False, True, None, ‘default’ (to test current param value), or lists of ints/floats/strings)

  • params (dict of PARAM:VALUE, optional) – one or more strategy parameters (defined as module-level attributes in the algo file) to modify on the fly before running the parameter scan (pass as {param:value}).

  • num_workers (int, optional) – the number of parallel workers to run. Running in parallel can speed up the parameter scan if your system has adequate resources. Default is 1, meaning no parallel processing.

  • progress (str, optional) – log backtest progress at this interval (use a pandas offset alias, for example “D” for daily, “W” for weekly, “M” for monthly, “A” for annually). This parameter controls logging in the underlying backtests; a summary of scan results will be logged regardless of this parameter. Using this parameter when num_workers is greater than 1 will result in messy and interleaved log output and is not recommended.

  • filepath_or_buffer (str, optional) – the location to write the output file (omit to write to stdout)

Return type:

None

Notes

Usage Guide:

Examples

Run a parameter scan for a moving average strategy called dma, then view a tear sheet of the results:

>>> from moonchart import ParamscanTearsheet
>>> scan_parameters("dma",
                    bundle="usstock-1min",
                    data_frequency="daily",
                    start_date="2015-01-03",
                    end_date="2022-06-30",
                    param1="MAVG_WINDOW",
                    vals1=[20, 50, 100],
                    filepath_or_buffer="dma_MAVG_WINDOW.csv")
>>> ParamscanTearsheet.from_csv("dma_MAVG_WINDOW.csv")

Run a 2-D parameter scan testing combinations of values for a long and short moving average, using 3 parallel worker processes:

>>> scan_parameters("dma",
                    bundle="usstock-1min",
                    data_frequency="daily",
                    start_date="2015-01-03",
                    end_date="2022-06-30",
                    param1="LONG_MAVG_WINDOW",
                    vals1=[100, 200],
                    param2="SHORT_MAVG_WINDOW",
                    vals2=[20, 50],
                    num_workers=3,
                    filepath_or_buffer="dma_LONG_MAVG_WINDOW_and_SHORT_MAVG_WINDOW.csv")
>>> ParamscanTearsheet.from_csv("dma_LONG_MAVG_WINDOW_and_SHORT_MAVG_WINDOW.csv")
quantrocket.zipline.create_tearsheet(infilepath_or_buffer, outfilepath_or_buffer=None)

Create a pyfolio PDF tear sheet from a Zipline backtest result.

Parameters:
  • infilepath_or_buffer (str, required) – the CSV file from a Zipline backtest (specify ‘-’ to read file from stdin)

  • outfilepath_or_buffer (str or file-like, optional) – the location to write the pyfolio tear sheet (write to stdout if omitted)

Return type:

None

quantrocket.zipline.trade(strategy, bundle=None, account=None, data_frequency=None, dry_run=False)

Trade a Zipline strategy.

Parameters:
  • strategy (str, required) – the strategy to run (strategy filename without extension)

  • bundle (str, optional) – the data bundle to use. If omitted, the default bundle (if set) is used.

  • account (str, optional) – the account to run the strategy in. Only required if the strategy is allocated to more than one account in quantrocket.zipline.allocations.yml.

  • data_frequency (str, optional) – the data frequency to use. Possible choices: daily, minute (or aliases d, m). Default is minute.

  • dry_run (bool) – write orders to file instead of sending them to the blotter. Orders will be written to /codeload/zipline/{strategy}.{account}.orders.{date}.csv. Default is False, meaning orders will be sent to the blotter and not written to file.

Returns:

status message

Return type:

dict

Notes

Usage Guide:

Examples

Trade a strategy defined in momentum-pipeline.py:

>>> trade("momentum-pipeline", bundle="my-bundle")
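
The account parameter is only needed when quantrocket.zipline.allocations.yml allocates the strategy to more than one account. As an illustrative sketch (the account codes and allocation values shown are hypothetical; consult the usage guide for the authoritative file format), such a file maps accounts to strategies and capital allocations:

```yaml
# quantrocket.zipline.allocations.yml (hypothetical accounts and values)
DU12345:
  momentum-pipeline: 0.5   # allocate 50% of this account to the strategy
U67890:
  momentum-pipeline: 0.25
```

With the strategy allocated to two accounts as above, the account argument would be required, for example trade("momentum-pipeline", account="DU12345").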
quantrocket.zipline.list_active_strategies()

List actively trading Zipline strategies.

Returns:

dict of account: strategies

Return type:

dict

Notes

Usage Guide:

quantrocket.zipline.cancel_strategies(strategies=None, accounts=None, cancel_all=False)

Cancel actively trading strategies.

Parameters:
  • strategies (list of str, optional) – limit to these strategies

  • accounts (list of str, optional) – limit to these accounts

  • cancel_all (bool) – cancel all actively trading strategies

Returns:

dict of actively trading strategies after canceling

Return type:

dict

Notes

Usage Guide:

Examples

Cancel a single strategy:

>>> cancel_strategies(strategies="momentum-pipeline")

Cancel all strategies:

>>> cancel_strategies(cancel_all=True)
class quantrocket.zipline.ZiplineBacktestResult

Convenience class for parsing a CSV result file from a Zipline backtest into a variety of useful DataFrames, which can be passed to pyfolio or inspected by the user.

Notes

Usage Guide:

Examples

Run a Zipline backtest and parse the CSV results:

>>> import io
>>> f = io.StringIO()
>>> backtest("momentum_pipeline",
             bundle="etf-sampler-1d",
             start_date="2015-02-04",
             end_date="2015-12-31",
             filepath_or_buffer=f)
>>> zipline_result = ZiplineBacktestResult.from_csv(f)

The ZiplineBacktestResult object contains returns, positions, transactions, benchmark_returns, and the performance DataFrame.

>>> print(zipline_result.returns.head())
>>> print(zipline_result.positions.head())
>>> print(zipline_result.transactions.head())
>>> print(zipline_result.benchmark_returns.head())
>>> print(zipline_result.perf.head())

The outputs are ready to be passed to pyfolio:

>>> pf.create_full_tear_sheet(
        zipline_result.returns,
        positions=zipline_result.positions,
        transactions=zipline_result.transactions,
        benchmark_rets=zipline_result.benchmark_returns)
 

Zipline API

Resource Group

Bundles

Create Bundle
PUT/zipline/bundles/{code}{?ingest_type,sids,universes,exclude_sids,exclude_universes,sec_types,free,start_date,end_date,from_db,calendar,fields}

Create a Zipline bundle.

This endpoint defines the bundle parameters but does not ingest the actual data. To ingest the data, see the ingestion endpoint.

Not all parameters are applicable to all ingestion types. Please see the Python API reference to determine which parameters are applicable to which ingestion types.

Example URI

PUT http://houston/zipline/bundles/usstock-1min?ingest_type=usstock&sids=FI12345&universes=us-tech&exclude_sids=FI23456&exclude_universes=other-universe&sec_types=STK&free=false&start_date=2010-01-01&end_date=2019-01-01&from_db=cme-fut-1min&calendar=us_futures&fields=close:Close
URI Parameters
code
str (required) Example: usstock-1min

the code to assign to the bundle (lowercase alphanumerics and hyphens only)

ingest_type
str (required) Example: usstock

the type of ingestion

Choices: usstock sharadar from_db

from_db
str (optional) Example: cme-fut-1min

the code(s) of one or more history databases or real-time aggregate databases to ingest. If multiple databases are specified, they must have the same bar size and the same fields. If a security is present in multiple databases, the first database’s values will be used.

calendar
str (optional) Example: us_futures

the name of the calendar to use with this bundle

sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

universes
str (optional) Example: us-tech

limit to these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: other-universe

exclude these universes (pass multiple times for multiple universes)

sec_types
str (optional) Example: STK

limit to these security types. Applicable to Sharadar bundle only. Default is to include both stocks and ETFs.

Choices: STK ETF

free
bool (optional) Example: false

limit to free sample data

start_date
str (required) Example: 2010-01-01

limit to historical data on or after this date. This parameter is required and also determines the default start date for backtests and queries.

end_date
str (optional) Example: 2019-01-01

limit to historical data on or before this date

fields
str (optional) Example: close:Close

mapping of Zipline fields (open, high, low, close, volume) to db fields. Defaults to mapping Zipline ‘open’ to db ‘Open’, etc.

Response  200
Headers
Content-Type: application/json
Body
{
  "status": "success",
  "msg": "successfully created usstock-1min bundle"
}

List Bundles
GET/zipline/bundles/

List available data bundles and whether data has been ingested into them.

Example URI

GET http://houston/zipline/bundles/
Response  200
Headers
Content-Type: application/json
Body
{
  "usstock-1min": true
}

Delete Bundle
DELETE/zipline/bundles/{code}{?confirm_by_typing_bundle_code_again}

Delete a bundle.

Example URI

DELETE http://houston/zipline/bundles/usstock-1min?confirm_by_typing_bundle_code_again=usstock-1min
URI Parameters
code
str (required) Example: usstock-1min

the bundle code

confirm_by_typing_bundle_code_again
str (required) Example: usstock-1min

enter the bundle code again to confirm you want to drop the bundle, its config, and all its data

Response  200
Headers
Content-Type: application/json
Body
{
  "status": "deleted usstock-1min bundle"
}

Bundle Config

Get Bundle Config
GET/zipline/bundles/config/{code}

Return the configuration of a bundle.

Example URI

GET http://houston/zipline/bundles/config/usstock-1min
URI Parameters
code
str (required) Example: usstock-1min

the bundle code

Response  200
Headers
Content-Type: application/json
Body
{
  "ingest_type": "usstock",
  "sids": null,
  "universes": null,
  "free": false,
  "data_frequency": "minute",
  "calendar": "XNYS",
  "start_date": "2007-01-03"
}

Ingestions

Ingest Bundle
POST/zipline/ingestions/{code}{?sids,universes}

Ingest data into a previously defined bundle.

Example URI

POST http://houston/zipline/ingestions/usstock-1min?sids=FI123456&universes=nyse-stk-pharma
URI Parameters
code
str (required) Example: usstock-1min

the bundle code

sids
str (optional) Example: FI123456

limit to these sids, overriding stored config (pass multiple times for multiple sids)

universes
str (optional) Example: nyse-stk-pharma

limit to these universes, overriding stored config (pass multiple times for multiple universes)

Response  200
Headers
Content-Type: application/json
Body
{
  "status": "the data will be ingested asynchronously"
}

Zipline Config

Get Default Bundle
GET/zipline/config

Return the current default bundle, if any.

Example URI

GET http://houston/zipline/config
Response  200
Headers
Content-Type: application/json

Set Default Bundle
PUT/zipline/config{?default_bundle}

Set the default bundle to use for backtesting and trading.

Setting a default bundle is a convenience and is optional. It can be overridden by manually specifying a bundle when backtesting or trading.

Example URI

PUT http://houston/zipline/config?default_bundle=usstock-1min
URI Parameters
default_bundle
str (required) Example: usstock-1min

the bundle code

Response  200
Headers
Content-Type: application/json

Bundle Data

Query Bundle Data
GET/zipline/bundles/data/{code}.csv{?start_date,end_date,sids,universes,exclude_sids,exclude_universes,times,fields,data_frequency}

Query minute or daily data from a Zipline bundle and download to a CSV file.

Example URI

GET http://houston/zipline/bundles/data/usstock-1min.csv?start_date=2016-06-01&end_date=2017-06-01&sids=FI12345&universes=japan-bank&exclude_sids=FI23456&exclude_universes=other-universe&times=09:30:00&fields=Close&data_frequency=minute
URI Parameters
code
str (required) Example: usstock-1min

the bundle code

start_date
str (optional) Example: 2016-06-01

limit to history on or after this date

end_date
str (optional) Example: 2017-06-01

limit to history on or before this date

universes
str (optional) Example: japan-bank

limit to these universes (default is to return all securities in database) (pass multiple times for multiple universes)

sids
str (optional) Example: FI12345

limit to these sids (pass multiple times for multiple sids)

exclude_universes
str (optional) Example: other-universe

exclude these universes (pass multiple times for multiple universes)

exclude_sids
str (optional) Example: FI23456

exclude these sids (pass multiple times for multiple sids)

times
str (optional) Example: 09:30:00

limit to these times (pass multiple times for multiple times)

fields
str (optional) Example: Close

only return these fields

data_frequency
str (optional) Example: minute

whether to query minute or daily data. If omitted, defaults to minute data for minute bundles and to daily data for daily bundles. This parameter only needs to be set to request daily data from a minute bundle.

Choices: daily minute

Response  200
Headers
Content-Type: text/csv
Body
Sid,Date,Close
FI1715006,2017-01-27T09:30:00,48.65
FI1715006,2017-01-30T09:30:00,47.67
FI1715006,2017-01-31T09:30:00,48.97
FI1715006,2017-02-01T09:30:00,49.26
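
The long Sid/Date/Close layout shown above can be pivoted into a wide DataFrame with one column per sid. A minimal pandas sketch using the sample rows from the response body:

```python
import io

import pandas as pd

# Sample rows in the long layout shown in the response body above
csv_data = """Sid,Date,Close
FI1715006,2017-01-27T09:30:00,48.65
FI1715006,2017-01-30T09:30:00,47.67
FI1715006,2017-01-31T09:30:00,48.97
FI1715006,2017-02-01T09:30:00,49.26
"""

prices = pd.read_csv(io.StringIO(csv_data), parse_dates=["Date"])
# Pivot to one column per sid, indexed by date
closes = prices.pivot(index="Date", columns="Sid", values="Close")
print(closes.shape)  # (4, 1)
```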

Backtests

Run Backtest
POST/zipline/backtests/{strategy}{?data_frequency,capital_base,bundle,start_date,end_date,progress,params}

Run a Zipline backtest and write the test results to a CSV file.

The CSV result file contains several DataFrames stacked into one: the Zipline performance results, plus the extracted returns, transactions, positions, and benchmark returns from those results.

Example URI

POST http://houston/zipline/backtests/dual_moving_average.py?data_frequency=minute&capital_base=100000&bundle=usstock-1min&start_date=2010-01-01&end_date=2020-01-01&progress=M&params=MAVG_WINDOW:20
URI Parameters
strategy
str (required) Example: dual_moving_average.py

the file that contains the strategy to run

data_frequency
str (optional) Example: minute

the data frequency of the simulation

Choices: daily minute

capital_base
float (optional) Example: 100000

the starting capital for the simulation (default is 10000000.0)

bundle
str (optional) Example: usstock-1min

the data bundle to use for the simulation. If omitted, the default bundle (if set) is used.

start_date
str (required) Example: 2010-01-01

the start date of the simulation

end_date
str (required) Example: 2020-01-01

the end date of the simulation

progress
str (optional) Example: M

log backtest progress at this interval (use a pandas offset alias, for example “D” for daily, “W” for weekly, “M” for monthly, “A” for annually)

params
str (optional) Example: MAVG_WINDOW:20

one or more strategy parameters (defined as module-level attributes in the algo file) to modify on the fly before backtesting (pass as param:value)

Response  200
Headers
Content-Type: text/csv
Body
dataframe,index,date,column,value
benchmark,1,2010-01-05 00:00:00+00:00,benchmark,0.0016353229762877675
benchmark,2,2010-01-06 00:00:00+00:00,benchmark,-0.015836734693877474
transactions,754,2014-12-30 21:00:00+00:00,price,112.5200000000018
transactions,754,2014-12-30 21:00:00+00:00,sid,Equity(265598 [AAPL])
transactions,754,2014-12-30 21:00:00+00:00,symbol,Equity(265598 [AAPL])
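
The stacked CSV interleaves several DataFrames, keyed by the dataframe column. In the Python client, the ZiplineBacktestResult class parses this layout for you, but it can also be split manually. A minimal pandas sketch using the sample rows shown above:

```python
import io

import pandas as pd

# Sample rows in the stacked layout shown in the response body above
csv_data = """dataframe,index,date,column,value
benchmark,1,2010-01-05 00:00:00+00:00,benchmark,0.0016353229762877675
benchmark,2,2010-01-06 00:00:00+00:00,benchmark,-0.015836734693877474
transactions,754,2014-12-30 21:00:00+00:00,price,112.5200000000018
transactions,754,2014-12-30 21:00:00+00:00,sid,Equity(265598 [AAPL])
transactions,754,2014-12-30 21:00:00+00:00,symbol,Equity(265598 [AAPL])
"""

stacked = pd.read_csv(io.StringIO(csv_data))
# Split into one DataFrame per logical result set: dates become the index
# and the 'column' values become the columns
frames = {name: grp.pivot(index="date", columns="column", values="value")
          for name, grp in stacked.groupby("dataframe")}
print(sorted(frames))  # ['benchmark', 'transactions']
```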

Parameter Scans

Run Parameter Scan
POST/zipline/paramscans/{strategy}{?data_frequency,capital_base,bundle,start_date,end_date,param1,vals1,param2,vals2,progress,params,num_workers}

Run a parameter scan for a Zipline strategy. Returns a CSV of scan results which can be plotted with moonchart.ParamscanTearsheet.

Example URI

POST http://houston/zipline/paramscans/dual_moving_average.py?data_frequency=minute&capital_base=100000&bundle=usstock-1min&start_date=2010-01-01&end_date=2020-01-01&param1=SMAVG_WINDOW&vals1=20&param2=LMAVG_WINDOW&vals2=180&progress=M&params=MAVG_WINDOW:20&num_workers=2
URI Parameters
strategy
str (required) Example: dual_moving_average.py

the file that contains the strategy to run

data_frequency
str (optional) Example: minute

the data frequency of the simulation

Choices: daily minute

capital_base
float (optional) Example: 100000

the starting capital for the simulation (default is 10000000.0)

bundle
str (optional) Example: usstock-1min

the data bundle to use for the simulation. If omitted, the default bundle (if set) is used.

start_date
str (required) Example: 2010-01-01

the start date of the simulation

end_date
str (required) Example: 2020-01-01

the end date of the simulation

param1
str (required) Example: SMAVG_WINDOW

the name of the parameter to test (a module-level attribute in the algo file)

vals1
str (required) Example: 20

parameter values to test (values can be ints, floats, strings, False, True, None, ‘default’ (to test current param value), or lists of ints/floats/strings) (pass multiple times for multiple values)

param2
str (optional) Example: LMAVG_WINDOW

name of a second parameter to test (for 2-D parameter scans)

vals2
str (optional) Example: 180

values to test for parameter 2 (values can be ints, floats, strings, False, True, None, ‘default’ (to test current param value), or lists of ints/floats/strings) (pass multiple times for multiple values)

params
str (optional) Example: MAVG_WINDOW:20

one or more strategy parameters (defined as module-level attributes in the algo file) to modify on the fly before running the parameter scan (pass as param:value)

num_workers
int (optional) Example: 2

the number of parallel workers to run. Running in parallel can speed up the parameter scan if your system has adequate resources. Default is 1, meaning no parallel processing.

progress
str (optional) Example: M

log backtest progress at this interval (use a pandas offset alias, for example “D” for daily, “W” for weekly, “M” for monthly, “A” for annually). This parameter controls logging in the underlying backtests; a summary of scan results will be logged regardless of this parameter. Using this parameter when num_workers is greater than 1 will result in messy and interleaved log output and is not recommended.

Response  200
Headers
Content-Type: text/csv

Tear Sheets

Create Tear Sheet
POST/zipline/tearsheets

Create a pyfolio PDF tear sheet from a Zipline backtest result.

Example URI

POST http://houston/zipline/tearsheets
Request
Headers
Content-Type: text/csv
Body
dataframe,index,date,column,value
benchmark,1,2010-01-05 00:00:00+00:00,benchmark,0.0016353229762877675
benchmark,2,2010-01-06 00:00:00+00:00,benchmark,-0.015836734693877474
transactions,754,2014-12-30 21:00:00+00:00,price,112.5200000000018
transactions,754,2014-12-30 21:00:00+00:00,sid,Equity(265598 [AAPL])
transactions,754,2014-12-30 21:00:00+00:00,symbol,Equity(265598 [AAPL])
Response  200
Headers
Content-Type: application/pdf

Trade

Trade Strategy
POST/zipline/trade/{strategy}{?bundle,account,data_frequency,dry_run}

Trade a Zipline strategy.

Example URI

POST http://houston/zipline/trade/dual_moving_average.py?bundle=usstock-1min&account=U12345&data_frequency=minute&dry_run=false
URI Parameters
strategy
str (required) Example: dual_moving_average.py

the file that contains the strategy to run

bundle
str (optional) Example: usstock-1min

the data bundle to use. If omitted, the default bundle (if set) is used.

account
str (optional) Example: U12345

the account to run the strategy in. Only required if the strategy is allocated to more than one account in quantrocket.zipline.allocations.yml.

data_frequency
str (optional) Example: minute

the data frequency to use. Default is minute.

Choices: daily minute

dry_run
bool (optional) Example: false

write orders to file instead of sending them to the blotter. Orders will be written to /codeload/zipline/{strategy}.{account}.orders.{date}.csv. Default is false, meaning orders will be sent to the blotter and not written to file.

Response  202
Headers
Content-Type: application/json
Body
{
  "status": "the strategy will be traded asynchronously"
}

List Strategies
GET/zipline/trade/

List actively trading Zipline strategies.

Example URI

GET http://houston/zipline/trade/
Response  200
Headers
Content-Type: application/json
Body
{
  "U12345": [
    "dual_moving_average.py"
  ]
}

Cancel Strategies
DELETE/zipline/trade/{?strategies,accounts,cancel_all}

Cancel actively trading strategies.

Example URI

DELETE http://houston/zipline/trade/?strategies=dual_moving_average.py&accounts=U12345&cancel_all=true
URI Parameters
strategies
str (optional) Example: dual_moving_average.py

limit to these strategies (pass multiple times for multiple strategies)

accounts
str (optional) Example: U12345

limit to these accounts (pass multiple times for multiple accounts)

cancel_all
bool (optional) Example: true

cancel all actively trading strategies

Response  200
Headers
Content-Type: application/json
Body
{}

alphalens

Performance analysis of alpha factors

moonchart

Moonchart tear sheets and performance analysis

moonshot

This API is for writing Moonshot strategies. For backtesting and live trading of Moonshot strategies, see the quantrocket.moonshot API.

pyfolio

Performance and risk analysis tear sheets

quantrocket.get_prices

trading_calendars

Trading calendars for QuantRocket-supported exchanges. See also the usage guide.

zipline

This API is for writing Zipline strategies. For backtesting and live trading of Zipline strategies, as well as managing data bundles, see the quantrocket.zipline API.