Product roadmap

Submit dataset requests and feature ideas here. For bug reports, use our chat support or issues tracker instead.

Trending
  1. Real-time and historical data for Kraken

    We've received some requests recently for Kraken data. Please upvote if this is of interest. We're still determining whether this is worth the risk.

    Eric M Duncan

    0

  2. Trading calendar information

    This feature would allow users to request trading calendar information (such as trading session start and end times) via our API. This is especially useful for trading sessions that span multiple UTC dates, which means a single UTC date can contain parts of more than one trading session (see the sketch after this item).

    Renan Gemignani

    5
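
    A minimal sketch of the edge case motivating this request, using plain Python with hypothetical session times (not actual exchange hours):

    ```python
    from datetime import datetime, timezone

    # Two hypothetical sessions that open the evening before they close, so
    # their UTC date ranges overlap on the same calendar date.
    sessions = [
        (datetime(2024, 6, 2, 22, 0, tzinfo=timezone.utc),   # opens Sun 22:00 UTC
         datetime(2024, 6, 3, 21, 0, tzinfo=timezone.utc)),  # closes Mon 21:00 UTC
        (datetime(2024, 6, 3, 22, 0, tzinfo=timezone.utc),   # opens Mon 22:00 UTC
         datetime(2024, 6, 4, 21, 0, tzinfo=timezone.utc)),  # closes Tue 21:00 UTC
    ]

    query_date = datetime(2024, 6, 3, tzinfo=timezone.utc).date()
    touching = [s for s in sessions if s[0].date() <= query_date <= s[1].date()]
    print(len(touching))  # 2 -> a single UTC date intersects two trading sessions
    ```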

  3. Consolidated US equities data

    Currently, equities are supported via the individual proprietary feed of each venue. While the NASDAQ feed approximates the NBBO well enough most of the time, some users prefer something more closely in line with the actual NBBO from the SIPs. This feature request tracks three possible modes of consolidation, for both historical and live data:

    1. Databento server-side consolidation of multiple proprietary feeds (see the sketch after this item)
    2. Consolidated data from a proprietary feed like Nasdaq Basic in lieu of the SIP
    3. Consolidated data from the CTA/UTP SIPs

    We plan on implementing one or two of these three options.

    Tessa Hollinger

    12
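
    A minimal sketch of what the first mode (server-side consolidation) computes, assuming simplified per-venue top-of-book quotes; the venue codes and prices are made-up sample data, not actual feed output:

    ```python
    # Consolidate per-venue best bid/offer into an NBBO-style quote by taking
    # the highest bid and lowest offer across venues (real NBBO rules also
    # handle sizes, locked/crossed markets, and quote eligibility).
    venue_bbo = {
        "XNAS": {"bid": 100.01, "ask": 100.03},
        "ARCX": {"bid": 100.02, "ask": 100.05},
        "XBOS": {"bid": 100.00, "ask": 100.04},
    }

    best_bid = max(q["bid"] for q in venue_bbo.values())
    best_ask = min(q["ask"] for q in venue_bbo.values())
    print(best_bid, best_ask)  # 100.02 100.03 -> consolidated inside quote
    ```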

  4. Binance data (cryptocurrency spot, futures, options)

    We've received some requests recently for Binance data. Please upvote if this is of interest. We're still determining whether this is worth the risk.

    Christina Qi

    1

  5. Add Polars support to `to_df` method

    Could we add support for making the result of `DBNStore.to_df` a Polars DataFrame as well? Perhaps the function signature could simply be overloaded with a `to_polars: bool` argument, using `@overload` stubs such as `def to_df(self, to_polars: Literal[False]) -> pd.DataFrame: ...` and `def to_df(self, to_polars: Literal[True]) -> pl.DataFrame: ...` (see the sketch after this item). Alternatively, `to_df` could be split into two functions, `to_pandas` and `to_polars`. Either way, it would be helpful to avoid having to do `pl.from_pandas(store.to_df().reset_index(drop=False))`. Plus, Polars can convert to pyarrow-backed pandas zero-copy, but not the other way around.

    Aidan L

    1
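
    A minimal sketch of the proposed overloads. The `to_polars` parameter does not exist in the current client; the class below is a stand-in for illustration only, not the real `DBNStore`:

    ```python
    from typing import Literal, overload

    import pandas as pd
    import polars as pl


    class DBNStoreSketch:
        """Stand-in for DBNStore, showing only the proposed to_df signature."""

        @overload
        def to_df(self, to_polars: Literal[False] = ...) -> pd.DataFrame: ...
        @overload
        def to_df(self, to_polars: Literal[True]) -> pl.DataFrame: ...

        def to_df(self, to_polars: bool = False):
            pdf = self._to_pandas()  # placeholder for the existing pandas path
            if to_polars:
                # The manual conversion the request wants to avoid, done inside
                # the library instead of by every caller.
                return pl.from_pandas(pdf.reset_index(drop=False))
            return pdf

        def _to_pandas(self) -> pd.DataFrame:
            # Placeholder data; the real method decodes DBN records.
            return pd.DataFrame({"price": [1.0, 2.0]})


    store = DBNStoreSketch()
    df_pl = store.to_df(to_polars=True)  # -> polars.DataFrame
    ```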

  6. An official Go (Golang) client library (similar to your existing Python offering)

    Hi there! I've only been working with your service a short time, but I'm already committed to your platform. Python is a great general-purpose language, but I think I'm justified in saying that for certain use cases, a compile-to-native-binary language like Go is superior for speed and simplicity (no package dependencies to manage, and it's easier to build dockerized/containerized applications).

    Justification: while there are other compile-to-native languages (Rust, Haskell, and Elixir, I've heard), Go has a significantly larger developer community. It will be a lot easier to crowdsource bug fixes, QA, and enhancements, and to fuel adoption. I believe only the bare-minimum functionality should be "made official" by you. While I think some things should be made simpler than what you provide in your Databento Python SDK, I believe it's more important to keep a consistent, bare-bones functionality across all your SDK offerings for accessing your core datasets.

    Go (if you weren't aware) allows exposing packages/modules that Python can consume, so you should in theory be able to design one top-level, one-to-many code base that can be repurposed for Python (and any other languages). Instead of maintaining two SDKs, you maintain one and port to the other via package exposure (screenshot attached). While you already provide a pip-installable package, this approach would also make it easier for you to push to other package managers like Homebrew, NuGet, and npm without having to refactor and maintain code bases for multiple languages.

    I'm a long-time Go developer, so I would 120% like to volunteer to assist with development and design efforts. I've already done quite a bit on my own project that I believe others would find useful. (I believe you'd also benefit from having an end user and consumer aid in designing the final consumer product(s). You know what they say about starship designers never actually getting to go into space, LOL.)

    Long story short, an official Go SDK would offer minimal effort, maximum coverage, and enhanced scalability. Thanks in advance!

    Terry J

    2

  7. Earnings release dates and estimates

    Note: This is not typically included in corporate actions (which are available) or fundamentals (https://roadmap.databento.com/b/n0o5prm6/feature-ideas/equities-reference-fundamental-and-static-data).

    Tessa Hollinger

    0

  8. Provide adjusted continuous contract

    Our continuous contract symbology does not behave the same as the continuous contracts provided by retail charting apps, which create a continuous series by applying a constant offset to the lead-month contract at each rollover. Our philosophy is generally to provide raw prices because:

    (a) adjustments are opaque and may introduce vendor errors;
    (b) raw prices give you the flexibility to apply your own custom rollover adjustments;
    (c) adjusted prices will throw off certain feature/signal calculations;
    (d) in practice you can't hold an instrument through the roll anyway, so adjusted prices may underestimate the slippage you'll experience from crossing the spread on the legs or the listed spread;
    (e) there's no single rollover rule that we expect to be preferred by all customers for all symbols, e.g. for a symbol with term structure like SR3, or a physical commodity with seasonality and more than the common quarterly expiration schedule.

    However, we may consider providing something like this either:

    a) as a convenience feature to support legacy use cases that require adjusted continuous contracts; or
    b) as a code example showing how the user can compute the appropriate price adjustment themselves (see the sketch after this item).

    Tessa Hollinger

    2
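
    A minimal sketch of option b) above, assuming the user already has per-contract daily settles and has chosen their own roll date; the prices and roll rule are illustrative only:

    ```python
    import pandas as pd

    # Made-up daily settles for a hypothetical front and back contract month.
    front = pd.Series(
        [100.0, 101.0, 102.0],
        index=pd.to_datetime(["2024-06-10", "2024-06-11", "2024-06-12"]),
    )
    back = pd.Series(
        [103.5, 104.0, 105.0],
        index=pd.to_datetime(["2024-06-12", "2024-06-13", "2024-06-14"]),
    )
    roll_date = pd.Timestamp("2024-06-12")

    # Constant-offset back-adjustment: shift all pre-roll prices by the
    # front/back gap observed on the roll date, then splice the series.
    offset = back.loc[roll_date] - front.loc[roll_date]   # 103.5 - 102.0 = 1.5
    adjusted = pd.concat([front.loc[:roll_date] + offset, back.loc[roll_date:]])
    adjusted = adjusted[~adjusted.index.duplicated(keep="last")]
    print(adjusted)
    ```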

  9. Real-time and historical index data

    Currently, indices are only supported indirectly through tradable index instruments such as CME futures and ETFs; we don't provide the non-tradable index values themselves. These may be sourced from a feed like the Cboe Global Indices Feed or the NYSE Global Index Feed.

    Tessa Hollinger

    24

  10. CFE Book Depth

    Full depth of book feed for Cboe Futures Exchange (CFE). CFE contains volatility futures and corporate bond index futures, such as VIX futures (VX, VXM).

    Zach Banks

    10

  11. Eurex EOBI dataset

    Data for Eurex, including all schemas (MBO, MBP, OHLCV, etc.).

    Renan Gemignani

    13

  12. Cboe FX ITCH (forex, foreign exchange)

    All orders plus last look quotes from 35 major banks and non-bank LPs, on one of the largest FX venues.

    Tessa Hollinger

    13

  13. WebSocket API for live data

    To extend support to browser-based applications.

    Tessa Hollinger

    5

  14. Official C# client library

    This client library would make all our historical and live features easier to integrate in C# on Windows, Linux, and macOS. C# is already supported today through our HTTP API and Raw TCP protocol, which are both language-agnostic.

    Tessa Hollinger

    11

  15. Provide snapshots for historical and live data

    This serves as a master list of all other snapshot-like features on our roadmap. The scope of this ticket is potentially very large and ambiguous, so we've broken it down into smaller tickets that you can follow separately:

    1. (Historical only) https://roadmap.databento.com/b/n0o5prm6/feature-ideas/add-historical-endpoint-for-latest-snapshot-of-any-schema. This would allow a user to get the latest published value of any given schema, within the boundaries allowed by licensing, entitlements, and the historical embargo window. The main benefit of this is for creating ticker tape or latest-quote features, e.g. on a web app, after we start exposing intraday data over the historical/HTTP API (https://roadmap.databento.com/roadmap/expose-intraday-and-current-trading-session-historical-data-over-historical-http-api-and-clients). Likely endpoint names for this would be either timeseries.get_last or timeseries.get_snapshot.

    2. (Historical only) https://roadmap.databento.com/b/n0o5prm6/feature-ideas/provide-snapshots-as-of-specified-time-in-historical-api. This allows a user to get the last published value of any given schema at a specified time. The main benefit of this would be to allow customers to subsample the data on the server side and reduce cost, though the benefit is diminished by feature 5 on this list. Note that this would allow a user to emulate (1) relatively well, since a user could pass in their current clock time or a time slightly ahead of it; however, the underlying implementations would be different, and (1) and (2) would likely be released separately. Likely endpoint names for this would be either timeseries.get_last_asof or `timeseries.

    3. (Live only) https://roadmap.databento.com/roadmap/add-periodic-mbo-book-snapshots-to-live-api.

    4. (Live only) https://roadmap.databento.com/b/n0o5prm6/feature-ideas/allow-live-api-clients-to-request-for-mbo-snapshot-recovery. This provides resilience to gaps or data errors originating on the Databento side. It could also be used to recover book state after client-side issues or disconnection, but would be slower than feature (3) on this list.

    5. (Both historical and live) https://roadmap.databento.com/roadmap/fixed-interval-mbp-1-summaries-eg-1-minute-bbo-or-subsampled-bbo. The purpose of this is mainly to give customers a convenience over fetching or subscribing to MBP-1 and then subsampling and forward-filling the data themselves, which could be very expensive given the size of MBP-1 data and the fact that the customer has no way of knowing how far to look back for the "last" MBP-1 update prior to the 1-second or 1-minute refresh interval (see the sketch after this item).

    Some of these are already in development, hence the status of this entire ticket; however, you should check each individual one in case the specific feature you're looking for is still in the Considering state.

    Tessa Hollinger

    7
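
    A minimal sketch of what feature 5 above would replace, i.e. the subsample-and-forward-fill step users currently do themselves; the MBP-1 rows below are made-up sample data, not output of an actual Databento query:

    ```python
    import pandas as pd

    # Made-up MBP-1 top-of-book updates arriving at irregular event times.
    mbp1 = pd.DataFrame(
        {"bid_px": [100.00, 100.01, 100.01, 100.02],
         "ask_px": [100.02, 100.03, 100.02, 100.04]},
        index=pd.to_datetime([
            "2024-06-03 13:30:00.125",
            "2024-06-03 13:30:41.900",
            "2024-06-03 13:32:10.004",
            "2024-06-03 13:33:59.500",
        ]),
    )

    # Subsample to a fixed 1-minute grid and forward-fill the last known quote.
    # This is where the lookback problem bites: if no update falls inside an
    # interval, you must already hold enough earlier data to fill it.
    bbo_1m = mbp1.resample("1min").last().ffill()
    print(bbo_1m)
    ```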