Product roadmap
Submit dataset requests and feature ideas here. For bug reports, use our chat support or issues tracker instead.
Real-time and historical index data
Currently, indices are only supported indirectly through tradable index instruments (CME futures, ETFs, etc.); we don't provide the non-tradable index values themselves. This data could be sourced from a feed like the Cboe Global Indices Feed or the NYSE Global Index Feed.
Tessa Hollinger · 31
Add dark mode
Original request from Juan Linares: "Great product but please add dark mode." There are two separate parts to this:

- Dark mode for the portal and main website (databento.com, databento.com/portal)
- Dark mode for the docs

We can consider this only after Q1 2025, since we're doing a major rebranding of our website that is expected to finish by early April 2025. The new colors will make it easier for us to implement a dark mode.
Juan L · 3
Limited support for L2/L3/MBP-10/MBO on Standard plans
The legacy live usage-based plans allowed users to access L2/L3 data. However, L2/L3 were pulled from the Standard plan. One possibility for increasing the value of the Standard plan is to offer limited access to L2/L3, perhaps gated by a quota on symbol subscriptions per account.
Tessa Hollinger · 4
Official C# client library
This client library would make all of our historical and live features easier to integrate in C# on Windows, Linux, and macOS. C# is already supported through our HTTP API and raw TCP protocol, both of which are language-agnostic.
Tessa Hollinger · 16
Add Polars support to `to_df` method
Could we add support to make the result of `DBNStore.to_df` a Polars dataframe as well? Perhaps the function signature could just be overloaded with a `to_polars: bool` argument. Something like:

```python
@overload
def to_df(self, to_polars: Literal[False]) -> pd.DataFrame: ...
@overload
def to_df(self, to_polars: Literal[True]) -> pl.DataFrame: ...
```

Or maybe `to_df` is split into two different functions, `to_pandas` and `to_polars`. Either way, it would be helpful to avoid having to do `pl.from_pandas(store.to_df().reset_index(drop=False))`. Plus, Polars can convert to pyarrow-backed pandas zero-copy, but not the other way around.
Aidan L · 2
Canadian exchange (TSX) data
Equity data from TSX with broker IDs (TSX is one of the few markets out there with post-trade transparency).
Marius Z · 1
Kalshi data
Kalshi is a regulated exchange where you can trade on the outcome of real world events: https://kalshi.com/
Tessa Hollinger · 0
US mutual fund data
Point-in-time mutual fund data, including mutual fund performance, expenses, and other related information such as equity holdings.
Tessa Hollinger · 0
Support non-fatal errors in the live API
Currently, any unresolved symbol results in the session ending and the connection being closed. The live API should support non-fatal errors for issues like an unsupported schema or symbols that fail to resolve. This is tracked internally as D-579.
Carter Green · 3
Add futures listed on the Montreal Exchange
It would be great to have futures listed on TMX, particularly the equity and bond futures (SXF, CGB, CGF, etc.).
Sheikh S · 0
Split batch jobs by parent symbol
For futures and options datasets, it would be useful to have the option to split batch files by parent symbol instead of by raw_symbol or instrument_id. This would produce a more manageable number of files and match some common access patterns. Internal tracking: D-3456
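The bucketing this request describes can be illustrated with a small sketch. The regex and the parent-derivation rule here are assumptions for plain CME-style futures raw symbols (root + month code + 1-2 digit year); options and spread symbols would need additional handling:

```python
import re
from collections import defaultdict

# Assumed pattern for CME-style futures raw symbols, e.g. "ESZ5", "CLM26":
# root, then a month code (FGHJKMNQUVXZ), then a 1-2 digit year.
_FUTURE = re.compile(r"^([A-Z0-9]+?)[FGHJKMNQUVXZ]\d{1,2}$")


def parent_symbol(raw_symbol: str) -> str:
    """Derive the parent (root) from a raw futures symbol.

    Falls back to the raw symbol itself when the pattern doesn't
    match (e.g. options or spreads, not handled in this sketch).
    """
    m = _FUTURE.match(raw_symbol)
    return m.group(1) if m else raw_symbol


def split_by_parent(raw_symbols):
    """Group raw symbols into one bucket (file) per parent symbol."""
    buckets = defaultdict(list)
    for s in raw_symbols:
        buckets[parent_symbol(s)].append(s)
    return dict(buckets)
```

Grouping this way collapses hundreds of per-contract files into one file per product family, which is the "more manageable number of files" the request asks for.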
Zach Banks · 1
Improved or additional roll rule support
Report by Stefan:

```python
client.symbology.resolve(
    dataset="GLBX.MDP3",
    symbols=["ES.v.0", "ES.v.1"],
    stype_in="continuous",
    stype_out="instrument_id",
    start_date="2025-01-01",
    end_date="2025-06-01",
)
```

returns something like:

```python
{
    "result": {
        "ES.v.0": [
            {"d0": "2025-01-01", "d1": "2025-03-19", "s": "5002"},
            {"d0": "2025-03-19", "d1": "2025-06-01", "s": "4916"},
        ],
        "ES.v.1": [
            {"d0": "2025-01-01", "d1": "2025-03-19", "s": "4916"},
            {"d0": "2025-03-19", "d1": "2025-03-23", "s": "5002"},
            {"d0": "2025-03-23", "d1": "2025-06-01", "s": "14160"},
        ],
    },
    "symbols": ["ES.v.0", "ES.v.1"],
    "stype_in": "continuous",
    "stype_out": "instrument_id",
    "start_date": "2025-01-01",
    "end_date": "2025-06-01",
    "partial": [],
    "not_found": [],
    "message": "OK",
    "status": 0,
}
```

Per the issue report: "ES.v.0 rolls from contract 5002 to 4916 on the 19th; however, ES.v.1 doesn't roll from 4916 to 14160 on the same day. Instead, it switches temporarily to the former front contract 5002. The logic naively just sorts every day by volume."

It's likely this will need to be addressed by introducing another roll rule rather than changing the behavior of existing roll rules, since we suspect a change would glitch on symbols like SR3, ZQ, and GC, which don't have a monotonic decay in volume/OI the further out you go on expiration month.

This roadmap ticket also tracks the implementation of time-to-expiry style roll rules, e.g. roll on T-1, ..., T-7 of expiration, similar to those found on other institutional data providers like Bloomberg. See also: https://roadmap.databento.com/roadmap/position-limit-rollover-rule-for-continuous-contract-symbology
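The time-to-expiry (T-N) style rule mentioned above can be sketched minimally. Contract symbols and dates below are illustrative; real expirations would come from the instrument definitions, and this sketch uses calendar rather than trading days:

```python
from datetime import date, timedelta


def roll_date(expiration: date, days_before: int) -> date:
    """Roll N calendar days before expiration (a T-N rule sketch)."""
    return expiration - timedelta(days=days_before)


def active_contract(contracts, on: date, days_before: int) -> str:
    """Return the front contract symbol on a given date under a T-N rule.

    ``contracts`` is a list of (symbol, expiration) pairs sorted by
    expiration. Because the roll date is fixed relative to expiry, the
    result is deterministic and cannot flip back to a former front
    contract the way a naive daily volume sort can.
    """
    for symbol, expiration in contracts:
        if on < roll_date(expiration, days_before):
            return symbol
    return contracts[-1][0]
```

A T-7 rule on a March contract expiring 2025-03-21 would switch to the June contract on 2025-03-14 and stay there, regardless of how volume is distributed on any given day.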
Tessa Hollinger0
Sub-penny precision on DBEQ.BASIC TRF trades
I'm a current subscriber using DBEQ.BASIC for equity trade data. I'm working on retail order flow identification using the sub-penny method. I've noticed that trades reported through the TRF venues (publisher IDs 40, 41) in DBEQ.BASIC appear to be rounded to half-penny precision (e.g., $665.4550 instead of $665.4523). The fractional cent component is almost always exactly 0.50, which makes it impossible to classify retail buys vs. sells. Meanwhile, the XNYS.PILLAR dataset does preserve finer sub-penny precision (I see fractions at 0.10, 0.20, 0.70, etc.), but that only covers NYSE-reported trades, not the bulk of off-exchange TRF flow where most PFOF/retail orders are reported.

My questions:
1. Is full sub-penny precision available anywhere in your datasets for TRF-reported trades? The FINN.NLS and FINY.TRADES datasets returned errors; are these planned for release?
2. Does the EQUS.PLUS or EQUS.ALL dataset preserve full sub-penny pricing? EQUS.ALL returned a date range error and EQUS.PLUS was listed as unsupported.
3. Is there a consolidated dataset (like DBEQ.BASIC but without the half-penny rounding) that I could use? Happy to pay for a higher-tier product if it exists.

I'm specifically trying to identify retail order flow in real time for SPY and other high-volume names using the sub-penny trade signature. Any guidance on which dataset or configuration would give me the precision I need would be greatly appreciated. Thanks, Daniel
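The "sub-penny method" the request relies on can be sketched to show why half-penny rounding breaks it. The cutoffs below follow the commonly cited Boehmer-Jones-Zhang style rule (an assumption about the requester's exact method):

```python
def classify_retail(price: float) -> str:
    """Classify a TRF trade from its sub-penny price signature.

    Sketch: the fraction of a cent in the trade price reflects price
    improvement given to a marketable retail order. A fraction just
    below a whole cent suggests a retail buy; just above, a retail
    sell. Fractions of exactly 0.0 or 0.5 (the half-penny rounding
    described in the request) carry no signal and are unclassifiable.
    """
    frac = round((price * 100) % 1, 4)  # fraction of a cent
    if 0 < frac < 0.4:
        return "retail sell"
    if 0.6 < frac < 1:
        return "retail buy"
    return "unclassified"
```

Applied to the request's own example: $665.4523 classifies (fraction 0.23), while the rounded $665.4550 lands exactly on 0.50 and is lost, which is the precision problem being reported.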
Daniel V · 1
Implied and Historical Volatilities
It would be nice to have historical volatilities of stock prices, and historical implied volatilities for options on stocks derived from the options market data, at least at a daily frequency. Thank you in advance for the consideration.
Milan K · 0