Launch HN: OneChronos (YC S16) – Combinatorial auctions market for US equities
231 points by lpage on Feb 7, 2022 | 118 comments
Hi HN—we're Kelly and Steve, co-founders of OneChronos (https://www.onechronos.com). OneChronos is a "Smart Market" for US equities—meaning we match counterparties using mathematical optimization instead of classical human auctioneer mechanics [1]. Our flavor of Smart Market—combinatorial auctions—lets users enter orders spanning multiple securities and specify matching preferences way beyond just price and quantity.

We didn't invent Smart Markets or combinatorial auctions. Roughly $1T/year flows through them in industries ranging from display advertising to telecommunications. The underlying theory was the subject of the 2020 Nobel Prize in Economic Sciences [2]. We're bringing them to capital markets, and we have both the customers and the regulatory clearance to do so. Our initial user base contains the household names cumulatively responsible for ≈70% of US equities trading volume.

Today's market structure costs institutional investors at least a trillion dollars annually. We'll go into the details below, but the big thing to understand is that mutual/pension/sovereign funds, 401K plans, and ETF managers pay the price, and ultimately it gets passed on to households. Given diverse investment time horizons and risk preferences, capital markets are not a zero-sum game, but the existing market structure makes it one. Any form of market friction that prevents mutually beneficial trades from happening is an economic loss. Our goal is to make a lot more mutually beneficial trades happen.

We started working on OneChronos as experienced traders and auction theorists. Even so, getting here has taken five years of iterating with customers, tackling two deep tech problems, and working through an involved regulatory process. We'll describe what's causing existing market friction, the solution, and why that solution is a significant technical lift.

When people hear about market friction and hidden costs, they usually think about low latency technology, market data, exchange fees, and predatory HFT practices. Those are significant, and yet they are rounding errors compared to others. The principal sources of market friction that we're attacking are twofold: first, bidders' inability to express economic complements (things that are worth more together than separately), substitutes (things with diminishing marginal utility that can replace each other), and non-price factors; second, game-theoretic incentives against bidding "truthfully", that is, against specifying how many units of a good you have and the highest price at which you'd buy or the lowest at which you'd sell them (your supply and demand curve). The most commonly proposed market structure "fixes," like single-good periodic batch auctions and the IEX speed bump, don't address any of these.

Imagine that a buyer values two goods A and B at $10 for the package, but only $4 for each individually since they're complements. Similarly, a seller might unload the package for $8 while demanding $5 for each good individually. Both agents have "exposure risk" if A and B are bought and sold separately—they might get stuck with an incomplete package. No trade happens if the risk is high enough (buy at $4, sell at $5, no cross). But if they can trade the package atomically, there's a mutual win of $2 in gains from trade. Similar missed opportunities happen if agents only want A XOR B or have different prices for different counterparties (price discrimination). This game of imperfect information and missed opportunities plays out every day in capital markets globally.
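The arithmetic in the example above can be checked directly. A minimal sketch, using only the numbers from the text:

```python
# Exposure risk vs. atomic package trading, using the figures from the
# complements example above.
buyer_package_value = 10.0   # buyer's value for {A, B} together
buyer_single_value = 4.0     # buyer's value for A or B alone
seller_package_ask = 8.0     # seller's ask for {A, B} together
seller_single_ask = 5.0      # seller's ask for A or B alone

# Trading leg by leg: the buyer bids 4 and the seller asks 5, so neither
# leg crosses and no trade happens.
legged_trade_happens = buyer_single_value >= seller_single_ask
assert not legged_trade_happens

# Trading the package atomically: 10 >= 8, so the trade clears and the
# $2 surplus (gains from trade) can be split between the two sides.
gains_from_trade = buyer_package_value - seller_package_ask
assert gains_from_trade == 2.0
```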

The straightforward solution to these problems is called "Expressive Bidding"—the ability to communicate parametric bids to the auctioneer, e.g., buy at most one of {$10 for A and B, $4 for A, $4 for B} or sell at most two units of A, pricing it at $10 for counterparty C_1, $9 for C_2, or $8 for C_3. Given everyone's Expressive Bid and a well-chosen objective function, the auctioneer uses constrained optimization to clear the market and unlock efficiencies. Awesome. So why didn't this happen when markets first started going electronic?

General combinatorial auctions are isomorphic to weighted set packing. Clearing them is an NP-complete optimization problem. Finding feasible and near-optimal solutions at the speed and scale of capital markets is deep tech problem #1. Furthermore, bidding in combinatorial auctions can be challenging in both a computational and UX sense. Making it easy is deep tech problem #2.
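To make the weighted set packing connection concrete, here is a toy exact clearing routine, assuming all-or-nothing single-sided bids (a deliberate simplification of real two-sided expressive bids): choose the pairwise item-disjoint subset of bids with maximum total value. The exponential blowup of this brute-force search is exactly why heuristics and learned search are needed at production scale.

```python
from itertools import combinations

def clear_auction(bids):
    """Exact clearing of a tiny all-or-nothing combinatorial auction by
    exhaustive search over subsets of (value, items) bids. Feasible means
    no item is awarded twice. This is weighted set packing, NP-complete
    in general, so this only works for toy inputs."""
    best_value, best_subset = 0, ()
    for r in range(1, len(bids) + 1):
        for subset in combinations(bids, r):
            items = [i for _, bid_items in subset for i in bid_items]
            if len(items) == len(set(items)):  # pairwise disjoint bids
                value = sum(v for v, _ in subset)
                if value > best_value:
                    best_value, best_subset = value, subset
    return best_value, best_subset

bids = [(10, {"A", "B"}), (6, {"A"}), (5, {"B"}), (7, {"B", "C"})]
value, winners = clear_auction(bids)
assert value == 13  # {A} at 6 plus {B, C} at 7 beats the package bid
```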

We tackle problem #1 similarly to how game AI like AlphaZero and optimizers like AlphaFold work. The combination of deep learning, heuristics, and classical AI search techniques is powerful, and applying them to combinatorial auctions in novel ways is a core part of our IP. Problem #2 involves the magic of formal methods. Expressive Bidding users submit snippets of code (a functionally pure subset of OCaml/ReasonML) called a Proxy Bidder. These proxies are essentially functions mapping "proposals" (allocations of goods) to prices, e.g., f({2A, -B}) → -5, meaning that the bidder wants $5 for buying two units of A and selling one unit of B. Using formal methods, we turn Proxy Bidders into Expressive Bids that our optimizers can understand. You can see what that looks like here [3]. This approach is dead simple for end users, but it took years of collaborative R&D with our friends and formal methods legends at Imandra [4] to enable.
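A Proxy Bidder is just a pure function from proposals to prices. The sketch below is hypothetical and in Python rather than the venue's actual OCaml/ReasonML; the pricing rule is invented purely to reproduce the f({2A, -B}) → -5 example from the text.

```python
def proxy_bidder(proposal):
    """Pure function mapping a proposed allocation to a price.
    Positive quantities are buys, negative are sells; a negative
    return value means the bidder is paid. Returning None declines
    the proposal. The $2.50/unit figure is illustrative only."""
    a, b = proposal.get("A", 0), proposal.get("B", 0)
    if a > 0 and b < 0:
        # Willing to buy A against a short B hedge, asking to be
        # paid $2.50 per unit of A bought.
        return -2.5 * a
    return None

# Matches the f({2A, -B}) -> -5 example above: the bidder wants $5
# for buying two units of A and selling one unit of B.
assert proxy_bidder({"A": 2, "B": -1}) == -5
assert proxy_bidder({"B": 1}) is None
```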

Not everyone needs to write or use Expressive Bids. For common use cases, we're offering pre-canned/forkable Expressive Bids for things like pairs trades and factor neutral portfolios. That aside, users who don't use Expressive Bidding still benefit from those who use it and create unique liquidity that doesn't exist on other trading venues. Our economic mechanism prevents "dark forest" scenarios in which Expressive Bidding has adversarial uses that detract from overall match quality. "Power users" can only benefit those who treat us like a vanilla trading venue—and each other.

We make money by charging a small commission in line with other venues ($0.0009) on each share traded. Longer-term, we're excited about a pricing model that balances computational resources used against liquidity contributed and compensates OneChronos based on how much value we add. Specifically, we'll measure how much notional dollar price improvement we generate for the market beyond what's generated by a "vanilla" double auction that we run in parallel (a neat trick enabled by Expressive Bidding: we can run an arbitrary number of auctions with different rulesets in parallel to measure relative performance). This approach aligns our incentives with our customers' and eliminates fixed costs (which cause market friction) from trading.

Only FINRA registered broker-dealers connect to OneChronos directly. If you work at one, and you're not yet a subscriber, please get in touch. We love talking to both subscribers and their customers, and we'd love to hear from institutional investors looking to leverage OneChronos through their existing broker algo and DMA workflows. Retail customers will eventually access us through brokers that choose to allow it (PFOF is its own thing). In the meantime, stay tuned for other more decentralized asset classes :) You can reach us at info (at) onechronos.com.

And we're hiring! If you're passionate about deeply technical problems ranging from mechanism design to applying ML to combinatorial optimization to writing compilers and engineering sophisticated distributed systems to HFT tolerances, get in touch — careers (at) onechronos.com.

Steve and I will be online today and would love to talk about our technical challenges, auctions/mechanism design, market structure, and the future of OneChronos.

[1] https://en.wikipedia.org/wiki/Smart_market

[2] https://news.stanford.edu/2020/11/19/bid-picture-nobel-prize... - Paul Milgrom, one of the laureates, is the Chair of OneChronos Labs, our research arm.

[3] https://www.onechronos.com/docs/expressive/bidding-guide/#in...

[4] https://www.imandra.ai/




For those who want a rough programming analogy here -- this sounds like support for multi-row transactions in a SQL database where only single row edits were allowed before.

Now you can describe "buy goods A and B at $10 maximum, commit" and have the transaction either succeed or fail. Before you had to edit those rows individually and there's risk that you end up in a weird partial state, hence having to lower your bid to cover your risk.

Really exciting tech and it'll be great for these costs in market-making to be eliminated!


Love that analogy—thanks! Combinatorial auctions are in large part about ensuring atomicity. Traders call it legging risk. Auction theorists call it exposure risk.


Thanks! :) good analogy and spot on about the market-making costs - hedging atomically as an EMM could unlock quite a lot in the way of liquidity.


Fantastic analogy, thanks.


Thanks for the TLDR I was never going to make it through their Bastille of text


This is part of the reason I come to HN: a detailed post on a problem (and sometimes a solution) I was unaware of.

A newbie question. I understand how a pair of tickets to, say, the Super Bowl would be more valuable than a single ticket, since people want to go with a friend. Is there a practical example for equities? I will buy 100 shares of $FB at $250 only if I can also get 50 shares of $SNAP at $35 at the same time? If I can't get that combo, I will only pay $240 for $FB?

Is this aimed at equities that have less liquidity?


Great question! Complements in capital markets typically aren't as strong as other markets like event tickets. On the complements side, hedges are a good example. Market makers like banks are willing to quote much larger sizes for hedged transactions, e.g., an institution that wants to buy a large block of equity in one company while selling others with similar qualities as "factor hedges." The net notional changing hands is roughly the sum of the parts. Still, there's a big price difference between doing this trade atomically and as a series of transactions where the market maker has to "wear" the risk for some period.

Substitutes in capital markets are ubiquitous. There might be hundreds of candidate hedges in the example above, but given how trading workflows are, there's no way to communicate that amongst market participants. A market maker has no way of knowing if someone wants to buy SNAP (and thus potentially has market-moving information about it) or if they're using it as a hedge for a short position and would substitute something that the market maker wants to gross down on (and offer a more aggressive price because of that). As such, market making is a game of pricing under risk and uncertainty. Combinatorial auctions eliminate much of the uncertainty.
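The substitutes case above can be sketched as a bid that prices several candidate hedges differently and will take at most one of them. Everything here, tickers, prices, and the at-most-one rule, is invented for illustration:

```python
# A market maker prices several substitutable hedges, quoting more
# aggressively on the one it wants to gross down on, and will accept
# at most one of them. Tickers and prices are hypothetical.
hedge_prices = {"XLK": 100.00, "QQQ": 100.05, "SNAP": 100.10}

def hedge_bid(proposal):
    """Accept at most one hedge leg; the price depends on which
    substitute is proposed. Decline anything else."""
    legs = [s for s, q in proposal.items() if q != 0]
    if len(legs) == 1 and legs[0] in hedge_prices:
        return hedge_prices[legs[0]] * proposal[legs[0]]
    return None

assert hedge_bid({"XLK": 1}) == 100.00
assert hedge_bid({"XLK": 1, "QQQ": 1}) is None  # at most one substitute
```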


Do you envision scenarios where this new expressiveness is used in strategic but not market efficient ways? For instance, in the SNAP example, presumably the correlate to price improvement when purchasing as a substitutable hedge is a price premium when purchasing a specific equity, as market participants can deduce -- in a way that they previously could not -- that there's something inherent to SNAP that one (or the market) values. I don't know if it's possible under the mechanics of your ATS, but this seems to produce an incentive to obfuscate such a purchase of SNAP specifically, potentially in ways that detract from market efficiency. Am I off base? To put it more generally I wonder whether this higher dimensionality might not lead to more sophisticated game-theoretic posturing, rather than less.


This is a great question, and the full answer involves lots of mechanism design nuance. The short answer is that we have uniform clears per trading instrument, and bids are sealed, so there's no direct signaling game that would allow someone to try to pass off an alpha trade as a hedge. We plan to introduce a mechanism that will enable signaling through tokens (opaque identifiers). Bidders can attach whatever token they want, and other bidders can price discriminate against tokens (change their prices for, refuse to trade with, trade exclusively with) based on historical post-trade outcomes that we make known via an immutable audit trail. Participants can create and use tokens freely, so it's not segmentation. Instead, it's a means of inducing a repeated play game and a market for reputation.


The founders can answer better, but a lot of traders are not just looking at individual stocks but rather packages of stocks together.

Say you want to invest in the health + tech space, but there's some risk that covid ends and gyms come roaring back. So then you want to minimize the risk by putting some money in the gym industry as well -- getting an entire package of peloton, apple, and 24hourfitness stock is actually worth more to you than the individual stocks on their own.


I sent you an email, but all I can say is please come to Switzerland so I can work with you guys :-) Everything I've seen so far looks quite awesome.

Two questions:

1. Are you implying you are using deep learning heuristics for weighted set packing? Assuming you can't share too much about your IP, did you have a regulatory or business need to deal with worst-case performance guarantees and (how) did you manage this if you did?

2. It sounds like a lot of your stack is OCaml (I'm a fan, 2nd most fanboyed language after Rust and it's a pity it's not more used), is this a deliberate choice or a "grew out of a research project in formal verification where they like ML" consequence?


Thanks! I'll reply to you there as well.

1. There are two places where deep learning and prior-based approaches can come into play for combinatorial auctions. One is pretty analogous to AlphaZero, but substitute placing a piece on a Go board with accepting a bid, hoping that upon reaching a terminal state the set of bids accepted is feasible and close to optimal. The second is perhaps more in line with what you mentioned—using ML for hyperparameter selection in an algorithm portfolio. When we go live and have production data, our meta optimizer will measure how different approaches are doing and allocate computational resources accordingly in an online fashion. We always use a vanilla unit double auction as a baseline to measure relative performance within an auction cycle, and if the baseline is better, we use it instead.

2. There's a fun and serendipitous story here. I wrote an extremely early prototype as a tiny Lisp and an evaluator to go with it. We needed a very restricted and functionally pure language whose execution context we could control, symbolically execute, and do basic formal methods on. The approach worked for a POC, but it was a far cry from real-world adoptable. We proceeded to prototype a DSL with an HM-inspired type system and a more Pythonic syntax, arriving at a poor man's ML. Better, but still a DSL, and something limited/bespoke that would ultimately be annoying for developers. Then we met the guys at Imandra [1], who convinced us that we could have our cake and eat it too using vanilla OCaml/ReasonML and an ultra-high-level theorem prover to keep code in an acceptable logic fragment. As an aside, Rust is our systems PL and where we do most of the heavy lifting. Evaluating Expressive Bids isn't computationally expensive relative to the optimization problem.

[1]: https://www.imandra.ai/
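The AlphaZero-style framing in (1) above can be sketched as sequential, feasibility-preserving bid acceptance whose result is compared against a simpler baseline. The value-per-item ordering heuristic below stands in for a learned policy/value function, and all numbers are invented:

```python
def greedy_clear(bids):
    """Cheap sequential 'policy': accept bids one at a time (like placing
    moves in a game), only when the partial allocation stays feasible.
    A learned value function would replace the ordering heuristic."""
    taken, sold, value = [], set(), 0
    for v, items in sorted(bids, key=lambda b: b[0] / len(b[1]), reverse=True):
        if sold.isdisjoint(items):  # accepting this bid keeps things feasible
            taken.append((v, items))
            sold |= items
            value += v
    return value, taken

bids = [(10, {"A", "B"}), (6, {"A"}), (5, {"B"})]
greedy_value, _ = greedy_clear(bids)

# Per the comment above, a vanilla baseline auction runs in parallel;
# if it does better within the cycle, its result is used instead.
baseline_value = 10  # e.g. the package bid crossing on its own
assert max(greedy_value, baseline_value) == 11
```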


Finally, something truly new, taking care of complexity for businesses for a win-win scenario, not aiming at buying users in batches with VC money and tracking them for life. I appreciate you being humble and honest that it is an application of known methods to a different problem. Good luck, let us know how to follow you for your latest achievements.


Thank you! We really appreciate that. We're happy to be working on something that has already worked incredibly well in other domains while facing major technical blockers against use in ours. It gives us a clear, albeit challenging, problem to tackle.


Auctions, deep learning, formal methods and discrete optimization, it seems like you got my list of things I want to learn and turned that into an amazing solution for a giant problem. Congratulations to the team, will be watching from afar and rooting for you!


All of our favorite things as well :)


What is your SLA for an expressive bid? I'm guessing it is less than 1ms?

Do you use a database of some sort?

How do you handle settlement?

How do you handle ingest?


> What is your SLA for an expressive bid? I'm guessing it is less than 1ms?

The optimization procedure (which includes bid evaluation) is ~30ms. We cycle bound (under a formal model of computation via function application and graph reduction) computation of bidders to ensure that everyone shares an identical and deterministic resource cap.
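The cycle-bounding idea can be sketched as follows. This is a hypothetical stand-in: a real implementation counts function applications/graph reductions inside the evaluator itself, not Python calls, and the budget number is invented.

```python
class BudgetExceeded(Exception):
    pass

def bounded_eval(fn, proposal, budget=1000):
    """Evaluate a proxy bidder under a hard, identical step budget so
    every bidder shares the same deterministic resource cap. A tick()
    callback stands in for counting reductions in a real evaluator."""
    steps = 0
    def tick():
        nonlocal steps
        steps += 1
        if steps > budget:
            raise BudgetExceeded
    try:
        return fn(proposal, tick)
    except BudgetExceeded:
        return None  # bid dropped deterministically, same cap for everyone

def cheap_bid(proposal, tick):
    tick()
    return -5

def runaway_bid(proposal, tick):
    while True:  # never terminates on its own; the budget stops it
        tick()

assert bounded_eval(cheap_bid, {"A": 2}) == -5
assert bounded_eval(runaway_bid, {"A": 2}) is None
```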

> Do you use a database of some sort?

Not as part of the real-time trading system, which operates as a CP fail-stop distributed system model checked for safety and liveness by TLA+ and system tested by Jepsen.

> How do you handle settlement?

Regular way (T+2 settlement with a 3rd party clearing BD)

> How do you handle ingest?

We use a constellation of GPS synchronized Stratum 1 clocks and proprietary network timestamping software + hardware to ensure that we process orders entered by the auction call time regardless of what physical host we receive the order on. We do the same for market data broadcast from other trading venues across data centers and geographies. We stream both market data and orders to a central point for processing. Every node in our distributed system that processes orders or “away venue” market data broadcasts a “Gateway Call Announcement (GCA)” message at auction call time to downstream compute nodes that run the auction. Auction solver nodes get to work after receiving GCA messages from the hosts they expect to hear from.
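The GCA barrier described above amounts to: don't start solving until every expected gateway has announced the auction call. A minimal sketch, with hostnames and the message shape invented:

```python
# Solver-side barrier: wait for a Gateway Call Announcement (GCA) from
# every gateway this node expects to hear from before optimizing.
EXPECTED_GATEWAYS = {"gw-ny4-1", "gw-ny4-2", "gw-md-1"}  # hypothetical

def ready_to_solve(received_messages):
    """True once a GCA has arrived from every expected gateway."""
    announced = {msg["gateway"] for msg in received_messages
                 if msg["type"] == "GCA"}
    return EXPECTED_GATEWAYS <= announced

msgs = [{"type": "GCA", "gateway": "gw-ny4-1"},
        {"type": "GCA", "gateway": "gw-ny4-2"}]
assert not ready_to_solve(msgs)  # still waiting on gw-md-1
msgs.append({"type": "GCA", "gateway": "gw-md-1"})
assert ready_to_solve(msgs)      # barrier passed; start the auction
```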


Could you use something like this to not rely on Stratum 1 clocks or have this as a backup? https://www.datacenterdynamics.com/en/news/facebook-creates-...


We're following the project with interest, but for now, we're focused on directing engineering resources to other portions of the stack. FWIW our approach is in the low nanos of precision and accuracy, which puts us within spitting distance of the Nyquist criterion for never aliasing two packets to the same timestamp (thus losing a total ordering) at line rate 10G. That's massive overkill for our purposes (auctions 100ms apart), but it's a satisfying property nonetheless :)
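The "Nyquist" claim above is easy to sanity-check: the minimum on-wire footprint of an Ethernet frame sets a floor on how close together two packets can arrive. The 5 ns figure below is an assumed stand-in for "low nanos":

```python
# At line rate on 10G Ethernet, a frame costs at minimum
# 64 B frame + 8 B preamble + 12 B inter-frame gap on the wire.
line_rate_bps = 10e9
min_wire_bytes = 64 + 8 + 12          # 84 bytes on the wire
min_packet_spacing_ns = min_wire_bytes * 8 / line_rate_bps * 1e9
assert abs(min_packet_spacing_ns - 67.2) < 1e-6

# So timestamping with low-single-digit-nanosecond precision can't
# alias two line-rate packets to the same timestamp.
timestamp_precision_ns = 5  # assumed; "low nanos" per the comment
assert timestamp_precision_ns < min_packet_spacing_ns
```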


Well, other than GPS being unavailable, right?

Another question: are you looking into having same-day settlement?


Since the GPS signal is just disciplining a local oscillator, it would have to be a sustained outage before drift starts to really matter. But yeah there is a point where it would make a difference.

> Same day settlement

This one is outside our control for the moment - we partner with a 3rd party for clearing and settlement, and would depend on our subscribers also making the switch to same-day.

Once we get into other asset classes, fast settlement is definitely of interest. Some cool stuff we could do with incorporating settlement instructions and/or counterparty risk constraints as part of the expressive bidding language.


Loved this from the first time I saw it some years back (became part of my example list of innovations out there)! Finally, some advanced auction mechanisms going broader.

From the description above: are you then just selectable as an algo/ATS going through a broker, i.e., there could be a natural "sweep" (a bit like algos covering block interest in some cases)? Do you work with some big broker-dealers on integration?

I know you started out with equities, but bond portfolio transitions are (often) a much bigger pain - any plans there? Or issuance, i.e., mix of funding instruments in one go?


> are you then just selectable as an algo/ATS going through a broker, i.e., there could be a natural "sweep" (a bit like algos covering block interest in some cases)?

Yes. Some brokers are incorporating us into their algo suite, others are offering us as a direct route, and most are doing both.

> Do you work with some big broker-dealers on integration?

Yes! We're excited to be launching with many of the household names, and most have plans to connect by early H2. We'll be updating our website with a list of launch partners in the coming weeks as part of our full launch announcement (the HN fam is hearing it first).

> I know you started out with equities, but bond portfolio transitions are a much bigger pain - any plans there? Or issuance, i.e., mix of funding instruments in one go?

Getting to this world state is our real passion. Imagine a fund manager running a cross-geography equities book and a credit book. Any trade they want to do will involve rates and currency risk on top of the actual delta. We want to make it easy to, say, sell some European debt issuances in euros to fund a US equities position in dollars while re-hedging curve risk, all as part of one atomic and frictionless transaction with a pre-trade known cost basis.


Thank you! I reckon the fully integrated world will also need some sort of "darkness" layer (in parts) to not warp liquidity and quotes too much for the less liquid things - but I can see the tech and algorithms for that being there already, just not used much.

Looking forward to reading the full launch announcement!


Can you say more about how you incentivize truthful bidding? I understand about exposure risk and expressive bidding but I'm not sure if that was meant to obviously imply something about truthful bidding which went over my head, or if you meant to not say more about it.

I love the part about eventually determining your value-add by comparing to a counterfactual vanilla market -- sounds a bit like Shapley value? If not exactly Shapley value?


> Can you say more about how you incentivize truthful bidding? I understand about exposure risk and expressive bidding

Both the multiunit dynamics and the specifics of our uniform clearing price mechanic minimize ex-post regret. Double auctions suffer from the winner's curse/adverse selection, as limit orders are always "traded through." Multiunit uniform clearing price mechanisms like OneChronos can lessen or eliminate that by incentivizing buyers and sellers to truthfully report aggregate supply and demand curves, and Expressive Bidding enables the reporting of those curves (among other things). NB: we are not an IC direct mechanism. We are budget balanced and individually rational.
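A minimal sketch of the uniform clearing price idea: aggregate demand and supply, find where they cross, and execute every match at one price, so a bid's limit affects whether it trades, not what it pays. The midpoint pricing rule here is one common textbook choice (a k-double auction with k = 1/2), not necessarily the venue's actual rule:

```python
# (limit, quantity) schedules, best-priced first; numbers are invented.
buys = [(10.2, 100), (10.1, 100), (10.0, 100)]
sells = [(9.9, 100), (10.0, 100), (10.15, 100)]

def uniform_clear(buys, sells):
    """Walk the sorted schedules together; match while they cross and
    set one clearing price for every matched share."""
    matched, clearing_price = 0, None
    for (b, bq), (s, sq) in zip(buys, sells):
        if b >= s:                        # this pair still crosses
            matched += min(bq, sq)
            clearing_price = (b + s) / 2  # midpoint rule, for illustration
        else:
            break
    return clearing_price, matched

price, qty = uniform_clear(buys, sells)
# 200 shares trade; the marginal crossing pair (10.1 vs 10.0) pins the
# price, so everyone trades at the same ~10.05 regardless of their limit.
assert qty == 200 and 10.0 <= price <= 10.1
```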

> I love the part about eventually determining your value-add by comparing to a counterfactual vanilla market -- sounds a bit like Shapley value? If not exactly Shapley value?

It's a hot take on both Shapley values and VCG (while avoiding the issues with both), and it's about to become an active area of research for us!


i'm not an economist or a game theorist so i don't remember the details but this paper talks about how certain market designs lead to untruthful bidding

https://www.cs.cmu.edu/~sandholm/vickrey.IJEC.pdf

but in the context of second price auctions.

lpage might be alluding to something having to do with their proxy bidder implementation but the above paper actually discusses how proxy bidders themselves lead to untruthful bidding (so maybe lpage is suggesting their implementation is better?).


VCG got a real-world test in FB's ad market [1], and the results were mixed. VCG is in a class of theoretically interesting but fragile and overly game-theoretic mechanisms. Our mechanism is boring from a mechanism design standpoint: it's a uniform clearing price periodic auction without any clever demand reduction or tricks aimed at incentive compatibility. The complexity of what we allow for with the bidding language makes closed-form/theoretical analysis at best difficult and, in cases, impossible. Instead, we focus on giving traders a direct means to express their valuations and a mechanism that minimizes information leakage and post-trade regret (situations where a bidder wishes they'd behaved differently given the auction's outcome).

[1] https://www.researchgate.net/profile/Alexander-Leo-Hansen/pu...


This seems like really deep technology. What is the one sentence describing who would use this and why though? Maybe that's in the long block of text somewhere.

I work in fintech for broker dealers so I'm genuinely curious what is the use case here.


> This seems like really deep technology.

That it is!

> What is the one sentence describing who would use this and why though?

Today's market structure costs institutional investors (and by extension households) at least a trillion dollars annually (and Smart Markets hold the potential to eliminate that loss).


Exciting implementation of a really cool concept.

> Today's market structure costs institutional investors (and by extension households) at least a trillion dollars annually (and Smart Markets hold the potential to eliminate that loss).

How does one get to this estimate? That is ~5% of US GDP. Everything else was easy to follow - this seemed high, at least intuitively.


It's a huge number that only makes sense in the context of the absurd scale of capital markets globally. BlackRock has written on the cost of liquidity [1]. Unfortunately, much of the institutional research on this topic is in a walled garden, so we plan on publishing on this when we have our own data. Treating it as a Fermi problem: the market cap of US equities is ~$50T, and ~$140T notional of US equities traded in 2021. The global market cap is ~$125T (I don't have trading volumes there). FICC is much larger than equities.

Portfolio returns compound exponentially, so even small inefficiencies matter big time.

[1] https://www.blackrock.com/corporate/literature/whitepaper/vi...


You can get to big numbers on global capital markets, for sure. I was wondering whether you had a consulting/VC-style estimate, given how specific the statement was: "Smart Markets hold the potential to eliminate that loss" of "at least a trillion dollars annually".

How do you think about it? Let's say we expect half the benefit to come from equities.

>> 0.5T / 125 T = 0.004

>> Smart Markets would need to raise portfolio returns by an average of .4% (net of trading costs) annually.


Don't know if this is accurate but the entire online stock brokerage industry revenue appears to be about $14B.

That seems like a significantly lower upper bound to the market size here.

That said, what seems interesting here is to come up in advance with many potential arbitrages and load them in advance for fulfillment if they occur. Risky, but more interesting than having to roll your own complex tool for this.

https://www.ibisworld.com/industry-statistics/market-size/on....


To clarify, 1T isn't what we're claiming as our revenue opportunity; it's what traders are missing out on annually in the form of portfolio returns due to market friction and missed Pareto outcomes.

(FWIW and not that it's the market that we're going after per se—our strategy is mostly blue ocean—the market for US equities electronic execution services across the whole stack of technology, market data, broker algos, etc., is $18B/yr.)


>Who would use this and why?

This isn't clear to me - are your buyers institutional investors? Are they buying your technology to create trade options for their end users, i.e., individual investors? I don't know what an ATS is, so I gather that I'm not a direct user of your technology - perhaps I would be an indirect user? Would E-Trade, for example, leverage your technology to provide me with a combinatorial buying option?


An ATS is like an exchange, so we match buyers and sellers. And you guessed correctly that the initial users are institutional investors - or more directly their brokers. So initially we'll have institutions creating and sending in "Expressive Bids" to improve their execution performance, and to express trades they currently can't via plain limit orders.

That said, we'd love to get to the point where E-Trade etc. are offering combinatorial bidding to retail traders, with us on the back end.


Thanks. Obviously I'm not a direct user of your technology and so maybe this is not intended for me but if you could translate your "A" and "B" into a hard, real-life example that I could understand I would be empowered to be an advocate for you. Best of luck to you.


No worries, happy to concretize this: the really easy example would be shoes. How much would you pay for just a right shoe or just a left shoe? A lot less than for the pair, since you might not be able to find the other shoe in the right size, condition, etc. Same with the seller - they don't want to be stuck trying to offload a single left shoe.

In stocks, A might be a company you invested in and B some ETF that you bought as a hedge for A. What if you sell out of A, and then the price of the ETF drops? There's value in being able to liquidate the full position - the single stock plus the hedge - at once.


Gut reaction:

There has to be a lot of additional data gathered at the time of 'intent to purchase or sell' - because otherwise your solution eats away a lot of powerful institutions' alpha. And without 'novelly' expressive orders, there's no new place for them to go... that trillion dollars doesn't just evaporate in today's world.


It's more like ensuring an opportunity to leverage the information that's already being gathered. As it stands, PMs have to construct concrete portfolios because they need to send the trading desk specific instructions on what to buy and sell. The portfolio they ship out for execution is effectively a low dimensional projection of a high dimensional decision process. That process has extensive flexibility (in sizing and in substituting names if something turns out to be more or less expensive to execute than transaction cost models predicted), but there's no way to communicate that in today's trading workflows. That results in the market missing out on Pareto outcomes.

We've already seen this in sourcing markets [1]. Capturing more information at the time of bidding resulted in massive (40-60%) efficiency gains for both sides of the market.

[1]: https://kilthub.cmu.edu/articles/journal_contribution/Very-L...


Oh yes, I see the problem statement and agree from a PM perspective this is quite good.

That said, there are a lot of people who make good money making inferences from these current concrete dynamics - in some sense, you're just forcing the market to innovate (this is good).

I always like to know who I'm asking to change when building products -- and this one is a very interesting (read: fun and potentially lucrative) set of actors.


> a trillion dollars annually?

lol. source / evidence?


Fully appreciate that it's a very large sounding but very real number once you start unpacking the scale of capital markets: https://news.ycombinator.com/item?id=30247693


I am aware of how big the markets are. I am also aware that transaction costs are minuscule.

By your own data above, if typical fees are $0.0009 per share traded, $1tr in costs implies notional value of instruments traded each year of approx $1x10^17, assuming average price of $100 / share.


Ah, agreed, but we're talking about two different things—direct transaction costs versus allocative inefficiency/missed Pareto outcomes. OneChronos is about unlocking Pareto efficiencies—situations in which two or more parties can trade to mutual benefit. An easy example is a (scaled down in price differences, scaled up in size) version of the complements example above, e.g., an ETF arb trading the basket against the underlying with a small tolerance for tracking error. An institution that can take the basket or the underlying as a hedge or as an investment position can interact with the arbitrager, creating economic gains for both parties in the process. At institutional scale, efficiency gains measured in bps and compounded exponentially add up.
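The "bps compounded exponentially" point is easy to make concrete; the portfolio size, gain, and horizon below are illustrative, not the venue's figures:

```python
# A few basis points of annual efficiency gain, compounded over a long
# horizon, on an institutional-scale portfolio. All numbers hypothetical.
assets = 1e9           # $1B portfolio
annual_gain_bps = 10   # 10 bps/yr of reduced friction
years = 30

improved = assets * (1 + annual_gain_bps / 10_000) ** years
extra_dollars = improved - assets

# Compounding beats the simple sum (30 yrs * 10 bps = $30M flat):
assert extra_dollars > 30_000_000
```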


> Today's market structure costs institutional investors (and by extension households) at least a trillion dollars annually (and Smart Markets hold the potential to eliminate that loss).

What is the subject and the verb for the problem you are solving and for whom? This is too vague.


How does this sit with other innovations such as all to all trading in OTC markets that are designed to match many buyer/sellers at, for example, a mutually beneficial mid price?


The products that still trade predominantly {OTC, bilateral, non-electronic} do so because non-price factors, trading conventions, and counterparty risk make it difficult or impossible to trade them on a central exchange. Expressive Bidding and a mechanism that allows for matching market dynamics (OneChronos) will enable electronification and more active trading in these markets. Ten years ago, I would have said that there were commercial headwinds against this (dealers wanting to trade bilaterally). With banks increasingly focused on repeatable and less variable trading revs, such is no longer the case. The unsuitability of such products for double auctions is the limiting factor.


Thanks. It is worth noting that a lot of traditional OTC products have, since the 2008 crisis, moved to electronic venues (exchange, ATS, MTF, etc.) with CLOB/RFQ/auction styles of execution, and in some cases that's coupled with central clearing. A lot of this has come from regulation (Dodd-Frank, MiFID II) and it's still ongoing.

The most interesting aspect of this is that it's enabled non-dealer <> non-dealer trading via certain venues.


This is a significant tailwind for the next wave of electronification that we're hoping to advance. US swap dealers, for example, now have a SEF reporting requirement, but most of the pre-trade is still in the screens. A Smart Market that allows dealers to control for non-price factors could change that.


Really cool stuff. Ease of use will be important.

The approach seems novel but conceptually these ideas exist in other parts of the market. Conditional and contingent orders have been around for a decade+. Options exchanges have complex order books. Supply/demand curves have been modeled in cryptocurrency smart contracts like Uniswap.

Clearly this idea is different and novel but borrows somewhat from all these concepts.


Thanks, and with you 100%. That's why we've invested heavily in an approach to Expressive Bidding that's more technically challenging for us but better for the end user. It's also worth noting two failed attempts at smart(er) markets—OptiMark and POSIT4. Both were ahead of their time and a little off the mark in aligning the mechanism with trader needs, but poor usability/high degrees of user-facing complexity didn't do them any favors.

> Conditional and contingent orders have been around for a decade+.

Conditional orders are a real testament to priorities shifting away from concerns over information leakage/fairness and towards concerns about how to get liquidity in an increasingly fragmented landscape. They're a nasty bandaid solution for routing opportunity cost.

> Options exchanges have complex order books

CLOBs for options and implied order books for futures dealt with some limited forms of exposure risk and proved a big boon for both markets, but they're also a bandaid solution (pre-defined packages, in some cases HFT-scale legging risk, and no ability to deal with substitutability or side constraints). We're excited to see what gains the general approach (combinatorial auctions) unlocks.

> Supply/Demand curves have been modeled in cryptocurrency smart contracts like Uniswap.

> Clearly this idea is different and novel but borrows somewhat from all these concepts.

While I agree with you that these are not new ideas, I'd phrase that a little differently and credit the actual inventors. Milgrom, Wilson, McAfee, Cramton, Ausubel, and too many others to name pioneered the theory and practice of multiunit and combinatorial auctions and, by extension, all of these concepts. Except for conditional orders, which make mechanism designers cry. There's truly nothing that better highlights the tragedy of the commons that is market fragmentation.


Congrats on the launch! What are the main benefits of this approach compared to creating additional combo (multi-leg) products on existing exchanges?

There are already a lot of mechanisms in traditional markets that deal with revealing or concealing true demand (e.g. block trades, icebergs, etc). Market-maker protections can allow outstanding orders to be cancelled if you get filled up to a predetermined risk setting. Most exchanges don't want to add additional complexity and more order types unless there's demand for it, which is presumably how market structure evolved to where it is today.

Combinatorial auctions are very interesting, but what's to stop exchanges from (1) creating more common bundles that people want to trade; (2) matching them with price-time priority so everyone gets a fair price? Wouldn’t the auction model just create wider or locked/crossed markets?


> Congrats on the launch! What are the main benefits of this approach compared to creating additional combo (multi-leg) products on existing exchanges?

Thanks! There are two main differences. For one, combos are (as the name suggests) predefined. That works reasonably well for products like futures and options, where the 80/20 approach covers the somewhat structural combos—different expiries in the crude and eurodollar complexes, or packs and bundles designed as standalone financial instruments/hedges. An implied liquidity generation mechanism can knock out some basic structural price arbs between combos, resulting in an approximation of a combinatorial auction.

This approach falls apart when the combinations are very general as they are in the markets for equities, credit, and many of the assets that trade in the screens.

The CLOB/predefined bundle approach also doesn't address substitutability and non-price factors, and dealing with those is key to unlocking Pareto efficiencies.

> There are already a lot of mechanisms in traditional markets that deal with revealing or concealing true demand (e.g. block trades, icebergs, etc)

The problem with block trading venues and other approaches, e.g., conditionals, boils down to incentives. Initiators of block trades are usually going in the same direction, so opportunities for direct interaction/coincidence of wants are rare. And market makers don't want to take large deltas unless they can hedge and/or know the counterparty. The net effect is not much size getting done. Conditionals are a similar story to blocks. They don't have the opportunity cost that a firm block resting on a venue does, but there's information leakage, and the surface area for interaction is still small. Market makers aren't incentivized to provide liquidity, and directional traders are worried about/behave strategically due to concerns over information leakage.

> what's to stop exchanges from (1) creating more common bundles that people want to trade

I'd say that the market has already done this in the form of ETFs and index products and an entire ecosystem of ETF market making emerged around it.

> (2) matching them with price-time priority so everyone gets a fair price? Wouldn't the auction model just create wider or locked/crossed markets?

I'm not sure that I follow this part entirely. The uniform price combinatorial auction that we're running results in everyone getting the same price on a symbol-by-symbol basis. And we view time priority as a bad thing (the arms race dynamic of time priority had been known to practitioners since markets first started going electronic, but Budish et al. were the first to write about it in detail). Periodic auctions have better fairness and post-trade mark-outs, both theoretically and in practice. Some of the European venues where batch auctions have made limited inroads have demonstrated this.


Agree with the combinations being more ephemeral outside of futures and options markets. I do wonder how much liquidity will exist for these combos, since even some of the popular structural ones don't have a ton of volume.

Regarding the last point, let's say hypothetically you create a market for "+100 FB shares, -500 SNAP shares". If everyone is competing on price to quote that combination, that creates the most competitive market. However, if there are many expressive bids with various conditions (e.g. minimum quantities, conditional on execution of another leg, etc), they may not get "implied" into creating a reasonable market, creating exponentially more arbitrage opportunities if they become locked/crossed. This adds a lot more complexity in calculating implied markets and matching them in a sensible way. With price-time priority, I agree that there are downsides as you mentioned, but it makes it easier to ensure the tightest spreads.


> let's say hypothetically you create a market for "+100 FB shares, -500 SNAP shares". If everyone is competing on price to quote that combination, that creates the most competitive market

Some combinatorial auctions make this tradeoff, packaging goods either to deal with computational limitations or to concentrate bidding on a few packages. It can work if there's near total consensus on what the economically relevant packages are, but it doesn't work otherwise. US equities is an "otherwise" case given a huge diversity of needs. The chance of someone wanting the opposite side of even a pairs trade at any given point in time is vanishingly rare. It's far more likely that the person doing the pair would interact with two or more counterparties who are independently interested in selling FB and buying SNAP, or willing to for the right price (perhaps conditional on hedging). The mechanism design game is more about giving every party the tools they need to communicate their value function to the auctioneer and creating the incentives to bid (close to) truthfully.

> However, if there are many expressive bids with various conditions (e.g. minimum quantities, conditional on execution of another leg, etc), they may not get "implied" into creating a reasonable market, creating exponentially more arbitrage opportunities if they become locked/crossed.

And this is why combinatorial auctions are a global optimization (as opposed to implied, which are effectively an iterative and greedy approximation) and, in our case, one that seeks to find uniform clearing prices. There are formats with price discrimination and others in which mechanical arbitrage within an auction is possible, but ours is not one of them. There isn't a separate price for {A}, {B}, and {A, B} — within each auction there's a uniform price for p_a and p_b, and p_{a,b} = p_a + p_b (and so on for any arbitrary linear combination). Theoretical point: linear prices don't always exist (in practice, they do for interdependent value goods that aren't strongly sub or superadditive, e.g., capital market goods), and they're not inherently desirable. We chose linear pricing largely because it's a natural fit for how capital markets work now, and perceived fairness/simplicity is itself a valid mechanism design consideration.
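To make the linear-pricing point concrete, here's a minimal sketch (hypothetical symbols and prices, plain Python rather than anything OneChronos actually runs): with one uniform clearing price per symbol, every package is priced as the linear combination of its legs, so the same instrument can never trade at two prices within a single auction.

```python
# Toy illustration: one uniform clearing price per symbol means every
# package is priced as the linear combination of its legs, so there's
# no mechanical arb within an auction.
clearing_prices = {"A": 10.00, "B": 25.50}  # hypothetical per-symbol prices

def package_price(package):
    """Price a signed bundle {symbol: quantity} under linear pricing."""
    return sum(qty * clearing_prices[sym] for sym, qty in package.items())

# p_{a,b} = p_a + p_b: the package price is just the sum of the legs.
assert package_price({"A": 100, "B": 100}) == \
    package_price({"A": 100}) + package_price({"B": 100})
# Buying and selling the same package nets to zero: no intra-auction arb.
assert package_price({"A": 100, "B": 100}) + \
    package_price({"A": -100, "B": -100}) == 0
```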


This is very cool. In a past life, I built a quant + HFT MM trading firm where we did a lot of spread trading. Always thought something like this was needed! Good work


Thank you! That's my past life as well. The impedance mismatch between what I knew was possible on the market design front courtesy of my academic background and what I did day-to-day as an algo trader is part of the origin story. Steve and I want to figure out how we could bring Smart Markets to finance so that everyone could spend more time on alpha and less time on market structure workarounds.


Trying to get a feel for how much liquidity you need to successfully execute on the complex expressive orders that might tie in multiple securities... Is it common that these types of orders run for many auctions, day, days or more to be fulfilled?


Excellent question! The TL;DR is that substitutability and the ability to manage risk (e.g. execute hedges or constrain factor exposure atomically) can be great for inducing liquidity in the absence of large volume and high turnover.

On substitutability: if you want to sell $2m of some sector basket, you wouldn't put in a limit order for $2m in every security (overfill risk). An expressive order can enforce a $2m global constraint across the basket but show full size in each security. When lots of people are doing this, it solves the "ships passing in the night" problem where people are looking for offsetting exposure at a high level but can't express anything but single stock orders.

On risk management: some constraints certainly are restrictive, e.g. conjunctive constraints like 'a' AND 'b' AND 'c'. It would be very unlikely that we find the exact opposite of that constraint, so the auction is multilateral: we can stitch together the contra with individual single orders for 'a', 'b', 'c'. Key to the liquidity aspect of this is our objective function: it rewards more aggressive pricing and larger quantities. So the principle here is that by gaining atomicity (and reducing uncertainty) people can be more aggressive on price and qty. This is especially important for liquidity provision: how much larger size could market makers quote if they could automatically hedge new positions they enter into?
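A toy sketch of the "$2m global constraint" idea above (hypothetical symbols and prices, plain Python rather than an actual Expressive Bid): show full size in each symbol, but only accept fills whose total notional respects the cap, which is what eliminates the overfill risk.

```python
# Sketch: an expressive sell order that shows full size per symbol but
# constrains the total notional sold across the basket.
def accepts(fills, prices, cap=2_000_000):
    """fills: {symbol: shares to sell}. True iff total notional <= cap."""
    notional = sum(shares * prices[sym] for sym, shares in fills.items())
    return notional <= cap

prices = {"XYZ": 100.0, "ABC": 50.0, "DEF": 20.0}  # assumed prices
assert accepts({"XYZ": 10_000, "ABC": 10_000}, prices)      # $1.5m: fine
assert not accepts({"XYZ": 15_000, "ABC": 12_000}, prices)  # $2.1m: rejected
```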

All that said, bootstrapping liquidity is the hardest part of any venue launch. We're obsessively focused on making sure we have the right blend of participants trading on different horizons for a healthy pool.


Really cool!

20 years ago my first job out of college was with a small company doing combinatorial auctions for the institutional bond market (among other industries). In the end the chicken/egg problem of having enough liquidity was too much to overcome.

At one point we pitched the NYSE on doing the opening call as a combinatorial auction but they were not interested.

We did have some success in pollution credit and trucking logistics markets though.

Of all the jobs I've had, that was easily the most fun/interesting algorithmically. I got to learn all about LP/MIP solvers, graph decomposition, distributed computation, etc.

Best of luck to you!


Thanks! We found that the "UX" aspect of fitting squarely into existing workflows to disrupt without disrupting is key—as is how we message the product. Both our focus on solving speed and Expressive Bids as code are "UX" decisions aimed at slotting us into existing market structure and mind space.

A surprising (to us) takeaway was that making a product in this space sound "vanilla"/undifferentiated is a good thing. Once folks aren't concerned about an initial integration being a lift, they're happy to onboard us as "just another trading venue" (but with a great story about unique liquidity and match quality). Many then get excited about adopting the lowest hanging fruit incrementally for their specific use cases. And after peeling away a few layers of the onion, they get excited about the future state in which others do the same, and what initially seems like incremental change becomes a market structure transformation.

We studied the history of adoption in other markets where it's gone well (FCC spectrum, display advertising, procurement) and poorly (OptiMark, POSIT4—great attempts, ahead of their time, killed by complexity and subtle mismatches between the mechanism and market participant needs).


Is this only effective at small scale? Many of the biggest hedge funds in the world that do high frequency make money by effectively front-running the book. If you are executing these trades across exchanges I don't see how you don't get front-run by HF firms.

This is the same problem eth et al. are dealing with in crypto swaps due to Miner Extractable Value (reordering the txs in a block so that miners extract value by front running trades).


The funny thing about the emergence of HFT is that if you truly only have a hundred or so shares to buy/sell, it's quite cheap and easy to do that now. Atomicity and substitutability aren't as important if there's plenty of liquidity relative to the size you're trading.

The harder problem that large traders face is executing blocks and portfolio trades. How do you figure out what your total transaction cost (market impact / cost of liquidity) will be if you're buying 100x the displayed volume? Being able to express where you are flexible (e.g. individual security prices) and aren't flexible (aggregate price, atomicity) helps lock in the uncertainty pre-trade.

So we're actually mostly going after the large scale stuff, more than the small scale.


This would be far more impactful IMO for a market where price discovery is weak and market access is challenging. For example, equities in Mercosur countries.

Very cool regardless.


> This would be far more impactful IMO for a market where price discovery is weak and market access is challenging

We very much agree that the potential is even more significant for markets that are more opaque and where price discovery is less efficient than in US equities. We chose US equities as a beachhead given the mature regulatory framework and the extent of market fragmentation. Other asset classes and geographies are immediate next steps.


If you decide to go that route, hire me. I can be useful there.


Please email me! careers [at symbol] onechronos.com


congrats on the launch! We talked briefly a while back, and I've been checking in on your page every so often to see when things would finally get rolling. The world needs more mechanism design.

Recently in school I've been thinking a lot about constant-function market makers. it occurs to me that you can think of a constant-function market maker as being kind of like an expressive bid. That is, putting your assets in a CFMM is saying you're willing to make any trade among a bunch of assets subject to F(net amount of A, net amount of B, net amount of C,..., net amount of $) = k for some F. Regular limit orders are a special case.

Do you have a sense of how your expressive bids overlap with these? Can I cook up some expressive bid that's equivalent to putting assets in a CFMM? What would the restrictions on F be to make things work with your solvers?


Hey (I don't want to out your first name here), we should catch up! I'll email you after digging out the inbox.

Proxy Bidders are pure functions that map inputs (market conditions on other venues, metadata) to Expressive Bids in our bidding language (a bounded fragment of linear integer real arithmetic, LIRA); certain EBs are CFMMs, and our most general solvers do SMT over LIRA.


Do you do only equities or also derivatives?

This is very interesting. Because you run frequent short auctions, there's no strict long-running orderbook here, right? Are you using FIX for your protocol and where are your servers geographically located?


> Do you do only equities or also derivatives

Initially we're US equities only. Stay tuned for other asset classes and geographies. Spot vs derivatives is a core use case that we want to do as soon as we can (only national exchanges can do listed derivatives trades, so it's a big lift).

> there's no strict long-running orderbook here, right

The default good-till behavior is one auction cycle (100ms Poisson random back-to-back).
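For intuition, "100ms Poisson random back-to-back" means exponentially distributed gaps between auction calls. A sketch, with an assumed 100 ms mean and a seed only for reproducibility (this is not their production scheduler):

```python
import random

# Sketch: back-to-back auction calls whose gaps are exponentially
# distributed (Poisson arrivals), so the exact call time of any given
# auction is unpredictable.
def auction_call_times(n, mean_ms=100.0, seed=7):
    rng = random.Random(seed)  # seeded only to make the sketch reproducible
    t, times = 0.0, []
    for _ in range(n):
        t += rng.expovariate(1.0 / mean_ms)  # exponential gap, mean 100 ms
        times.append(t)
    return times

calls = auction_call_times(10)
assert len(calls) == 10 and all(b > a for a, b in zip(calls, calls[1:]))
```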

> Are you using FIX for your protocol

Yes! And we have a formal model of our FIX spec and self-cert flow that makes onboarding us a much easier process than what's typical [1].

> where are your servers geographically located?

Initially, just Equinix NY5. Longer term, we plan on PoPing at most financial data centers. We use a constellation of GPS synchronized Stratum 1 clocks and proprietary network timestamping software + hardware to ensure that we process orders entered by the auction call time regardless of what physical host we receive the order on. We do the same for market data broadcast from other trading venues across data centers and geographies. We stream both market data and orders to a central point for processing. Every node in our distributed system that processes orders or “away venue” market data broadcasts a “Gateway Call Announcement (GCA)” message at auction call time to downstream compute nodes that run the auction.

[1]: https://www.onechronos.com/docs/fix/fix-42/


Thank you for your answers.

Exciting stuff. And love your docs.


Very cool market. This feels a lot like multi-leg option execution.

How do you think market makers are going to react to this? It makes sense for them to provide a bid/ask on individual series, but how do you see them providing liquidity for these more complex orders?


> Very cool market. This feels a lot like multi-leg option execution.

Thanks! And yep, similar but all the way down to the venue/match level, e.g. as opposed to a broker taking on some legging risk to shield the end investor.

> How do you think market makers are going to react to this?

Expanding on Kelly's take - the big thing it does for market makers is allow them to manage momentary risk. When a market maker gets filled on an exchange, they are immediately looking to hedge/offload what they took on, which involves a sequence of transactions.

Here, the hedge is baked in. So for example, they may enter an order that looks like "buy and/or sell any mix of these 200 securities, if and only if the net change in risk (e.g. change in exposure across several factors) is within some tolerable distance from 0". So that would look like a traditional bid-ask spread across a series of symbols, but with a global exposure constraint. The key outcome being they can quote larger sizes across symbols safely.

NB: the MM doesn't need to know anything about the composition of the complex order on the other side. On top of that, they may be filling one leg, and a natural or other LP filling another etc..
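As a rough illustration of that constraint (hypothetical factor loadings and tolerance, plain Python rather than an actual Expressive Bid): accept any mix of fills only if the net change in factor exposure stays within a tolerance of zero.

```python
# Sketch: "buy and/or sell any mix of these securities iff the net
# change in factor exposure is within some tolerable distance from 0".
def within_risk(fills, loadings, tol=1_000.0):
    """fills: {symbol: signed shares}; loadings: {symbol: [f1, f2, ...]}."""
    n_factors = len(next(iter(loadings.values())))
    exposure = [0.0] * n_factors
    for sym, shares in fills.items():
        for i, beta in enumerate(loadings[sym]):
            exposure[i] += shares * beta
    return all(abs(e) <= tol for e in exposure)

loadings = {"AAA": [1.0, 0.2], "BBB": [1.0, -0.2]}  # assumed loadings
assert within_risk({"AAA": 500, "BBB": -500}, loadings)         # nets out
assert not within_risk({"AAA": 5_000, "BBB": 1_000}, loadings)  # too much beta
```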


Some EMMs are excited to leverage Expressive Bidding to quote more size with less risk. For example, I'll bid 200 shares of XYZ @ $10.00 XOR {1,000 shares ABC @ $9.99, -500 shares (beta hedge) DEF @ 19.98}.

Folks also plan on using Expressive Bidding to enter other business lines that high startup costs or low margin (ex ongoing technology costs) previously kept them out of.
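That XOR bid can be sketched as a pure function over proposals (illustrative Python only; real Proxy Bidders are written in the OneChronos bidding language): value the XYZ leg or the hedged {ABC, DEF} package, never both in the same auction.

```python
# Sketch of the XOR bid: 200 XYZ @ $10.00, XOR the hedged package
# {1,000 ABC @ $9.99, -500 DEF @ $19.98}.
def xor_bid(proposal):
    """proposal: {symbol: signed shares offered}. Returns limit value or None."""
    if proposal == {"XYZ": 200}:
        return 200 * 10.00                 # pay up to $2,000 for the XYZ leg
    if proposal == {"ABC": 1_000, "DEF": -500}:
        return 1_000 * 9.99 - 500 * 19.98  # net price of the hedged package
    return None                            # reject any other mix

assert xor_bid({"XYZ": 200}) == 2_000.0
assert xor_bid({"XYZ": 200, "ABC": 1_000}) is None  # XOR: can't take both
```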


>Furthermore, bidding in combinatorial auctions can be challenging in both a computational and UX sense.

Can you elaborate on how it's challenging in a UX sense? I'm curious to know what the challenges are.


How bidders communicate bids to the auctioneer (the bidding language) is a central concern for any auction. It's pretty straightforward for unit good auctions (I'll pay $5 for A or I'll sell B for $4), but combinatorial auctions involve arbitrary packages of goods; the set of all possible bids is the powerset of the goods being auctioned. Having bidders attach a value to each package is both a computational impossibility for anything more than a few goods and a "UX" nightmare. For example, a bidder that wants to buy A for at most $5 in a market for goods A and B with free disposal (meaning you'll take something extra if it's free) needs to enter the bid ({A, $5}, {AB, $5}).

Information theory tells us that no universal bidding language (one with a representation of any package of interest) is uniformly more compact than the power set representation. Nonetheless, a good bidding language makes "common" bids compact and easy to communicate. We thought about this problem deeply and realized that functionally pure computer programs mapping proposals (packages of goods) to valuations (how much the bidder will pay or would want to receive) are about as natural as it gets. There's a direct analog in asking a human or a pricing algo for a price in a bilateral trade setting. However, our optimizer doesn't know what to do with an arbitrary computer program, and exhaustively querying one to get the power set of prices out is computationally infeasible. Using formal methods, though, we can (in the right setting) convert a computer program into an equivalent representation in a logic fragment called mixed integer real arithmetic. And that (via SMT solving) is something that an optimizer can work with.

You can see what Proxy Bidders (the pure functions that create expressive bids) look like here [1].

[1]: https://www.onechronos.com/docs/expressive/bidding-guide/#in...
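As a toy illustration of the free-disposal example above, here's a hypothetical pure-Python stand-in for a Proxy Bidder that encodes "buy A for at most $5" in a market for goods A and B without enumerating the power set (real Proxy Bidders use the OneChronos bidding language, not ad hoc Python):

```python
# Toy Proxy Bidder: a pure function from package to valuation.
def proxy_bidder(package):
    """package: frozenset of goods offered. Returns max willingness to pay."""
    if "A" in package:
        return 5.0  # B, if bundled in, is taken for free (free disposal)
    return 0.0      # packages without A are worthless to this bidder

assert proxy_bidder(frozenset({"A"})) == 5.0
assert proxy_bidder(frozenset({"A", "B"})) == 5.0  # same value: free disposal
assert proxy_bidder(frozenset({"B"})) == 0.0
```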


Thanks for the answer!


It is good to see methods routinely used in collateral trade matching finding their way into close-to-real-time exchange trade matching … though the former is a problem of much larger size …


The notional values in compression cycles are insane, and we're interested in the post-trade space as well. The optimizations done on the post-trade and funding side aren't combinatorial auctions, which results in efficiency losses and poses workflow challenges, especially for swap and CDS traders. That said, the opportunity for driving actual portfolio gains, and not "just" minimizing counterparty and systemic risk, is more significant on the pre-trade side.


Congrats on the launch! This is super interesting. Expressive bidding also opens up opportunities for arbitrage, do you see any potential downsides to this aspect of the market behavior?


Good question! Combinatorial auctions with a uniform clearing price eliminate mechanical arb; there's no opportunity to buy and sell a single trading instrument at different prices. Stat and funding arbs form "outside" of trading venues and they're a good thing that keeps the market efficient. Putting a combinatorial auction in the mix eliminates friction and entry costs for folks providing liquidity (no steep tech costs) and keeps the process competitive.


This makes a lot of sense in at least a few cases I can think of. Yes, if I want to do a pair trade/arb at a spread, it doesn't make sense to try and execute both parts atomically vs. interacting with another party directly looking to put on the inverse. No idea if the funky AI past that point adds value or is fluff, but I'm interested enough to find out.


poking around on your socials, it seems like you've been building for ~5 years, and are just now officially launching, after i guess a capital injection from yc.

since the core ip is "deep" as you say, i'm guessing it cost quite a bit to develop, unless you built out all of the components yourself, which, while possible, seems unlikely given the technical complexity of each piece (you, and whoever else is on the engineering team, seem smart but this looks like "research edge" tech along several dimensions).

so i'm curious whether you paid the development costs up front (either using your own money or FFF) or if you validated and raised in small pieces. if the latter, i'm curious how one does that for such a complex product/service.

lots of assumptions in the above - feel free to disabuse me of my ignorance.


> capital injection from YC

We raised a series A in 2019 led by Green Visor (who has been excellent btw, and with us from the start).

> it seems like you've been building for ~5 years ... i'm guessing it cost quite a bit to develop

Yep, you're spot on that it's a complex product. The biggest cost has been making it feel for the user like it's not. What that boils down to is an enormous amount of iterative feedback and development w/ the industry. Between that and the regulatory process, a lot of the "cost" has been more duration than cash burn. We've kept things lean from the start in anticipation of that.

> unless you built out all of the components yourself

We've developed the tech in house, with some hands on help from our friends at Imandra mentioned in the OP. On the research piece: that's been happening in the background for many years, and we're definitely building on the shoulders of giants in the worlds of mechanism design, algorithmic game theory, and deep learning. We're lucky to have some great academic advisors involved (like Kevin Leyton-Brown since the early days) as well.


>We've developed the tech in house

i'm not often impressed but that's quite impressive. kudos to you.

i currently work on deep learning compilers (as a phd student) but i'm interested in basically all of these things (compilers, combinatorial optimization, auction theory). i know lpage expressed that you're hiring but i'm curious what roles you're hiring for (your careers page is light on details).


We're still a small enough team that we're more focused on talent than roles. As an example of what that means, our stack is polyglot (rust, OCaml, elixir, python), and we don't assume or require that folks have worked in any of those languages before. We invest heavily in learning and teaching.

It sounds like you have a very relevant background, so please email us if you're interested in discussing further!


Are you expecting to only support institutional investors or retail order flow as well? It seems like this supports pushing the notional to shares conversion all the way down to the exchange, which I can see a use for.

How does the expressive bidding interact with NBBO held orders?


> Are you expecting to only support institutional investors or retail order flow as well?

We don't segment the market at all or exclude subscribers (beyond requiring that they're FINRA registered BDs), but the way that PFOF works means that we likely won't see retail order flow from the brokers that wholesale it. We do view Smart Markets as a win/win/win/win for retail customers, brokers, market makers, and regulators alike (cleaner routing, better transparency and price formation, better allocative outcomes, lower technology costs).

> How does the expressive bidding interact with NBBO held orders?

https://news.ycombinator.com/item?id=30257586

Expanding on that a bit—we have the standard Rule 611 requirements, meaning that we clear within the NBBO on a symbol by symbol basis. We don't route.


Why is price discrimination based on counterparty a desirable quality of a matching system?


Speaking more broadly than capital markets, it can result in more economically efficient outcomes, especially when some economic agents create disproportionately large negative externalities for others.

For OneChronos, we're a uniform price clearing mechanism that will eventually support submarkets, which can enable (among other things) what's effectively a mutual opt-in repeated play game of reputation and the ability to bid based on reputation. So we don't support any form of price discrimination at present, and what we will eventually support has the nuance that both parties choose to opt in (and Expressive Bidding means that they can bid in multiple submarkets simultaneously without exposure risk). This induces a meta-game of sorts — a market for reputation. We're including it as a cleaner and more transparent version of existing behaviors, with the efficiency gains to go with it.


I won't pretend to understand all of this. Despite having worked in systematic equities trading group, I don't have a deep understanding of market micro-structure.

I do appreciate your responding to my question.

I find the technology and ideas you're bringing to market incredibly fascinating. I can't wait to see what comes of this!


Thanks! I'll say that a key design goal of ours is to make it so that PMs/asset managers/execution traders running complex books don't have to think about microstructure. Today's markets and the algorithmic trading stacks surrounding them are enormously complex. Smart Markets and multiunit auctions trade external complexity and internal simplicity (the existing double auction world) for external simplicity/usability and internal complexity (the computational challenges of combinatorial auctions).


Can you talk about the regulatory issues that you faced and had to solve before launching?


We needed to work through the FINRA BD and SEC ATS-N registration process, and the latter is a requirement that went into effect well after we got started on OneChronos. We're pleased with how both went, and we chose US equities as a beachhead precisely because of how sophisticated the regulatory framework is. That said, it's quite the process, both time- and resource-wise. We'll have to work through similar processes to pursue other asset classes and geographies. Doing so is core to our mission of making portfolio-level transactions frictionless across asset classes and geographies.

You can read our ATS-N here: https://www.sec.gov/Archives/edgar/data/1692652/000169265220...


Is there any way to track the volume of trades currently being processed by your exchange? It would be good to be able to track this number in order to see at what point it becomes viable to take a serious look into trading on your platform.


We're still going through integration and test/simulated trading with our initial set of users, so nothing is public yet. But once we cut over to real cash equities, our volume data will all be published through FINRA's site[0].

[0]: https://otctransparency.finra.org/otctransparency/AtsData


How is your team organized? Is it US only or are there international/remote roles?


The company is US-based, but we're entirely open to remote work, especially for engineering, and we already have a handful of fully remote staff.


Sounds very interesting -- congrats on making it here, must be super exciting. A couple of questions:

How do you deal with best execution obligations for the "legs" of your trades?

What market data do you publish about your order book?


> How do you deal with best execution obligations for the "legs" of your trades?

Excellent question. Each leg of the trade comes in as an individual limit order with a price/quantity bound on it. This makes it easy for brokers to satisfy their 15c3-5 requirements since Expressive Bidding can only further restrict these limits. We clear the auction with a constraint that bounds the clearing region to the NBBO, snapshotted at the start of each auction. Rule 611 affords up to 1s if the market moves while we're optimizing, and away market movement has no average effect on execution quality, as auction start times are random and orders are firm.

In the future, we'll do ISO sweeps, but their usefulness is somewhat limited since getting fills from a sweep can change what's feasible for the auction. As we gather data on how folks use us, we'll hone in on other regulatory options.
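A minimal sketch of the bounding idea described above, assuming an NBBO snapshot is taken per symbol at auction start; the function and field names here are hypothetical, not the actual implementation:

```python
# A candidate clearing price must respect both the NBBO snapshot captured at
# auction start and each resting order's own limit price. Field names and
# structure are assumptions for illustration only.

def price_is_clearable(symbol, candidate_px, nbbo_snapshot, orders):
    """nbbo_snapshot: {symbol: (best_bid, best_ask)} taken at auction start.
    orders: list of (side, limit_px) tuples, side in {"buy", "sell"}."""
    best_bid, best_ask = nbbo_snapshot[symbol]
    if not (best_bid <= candidate_px <= best_ask):
        return False  # Rule 611: clear within the NBBO, symbol by symbol
    for side, limit_px in orders:
        if side == "buy" and candidate_px > limit_px:
            return False  # buyer would pay more than their limit
        if side == "sell" and candidate_px < limit_px:
            return False  # seller would receive less than their limit
    return True

# Inside the snapshot NBBO and within both legs' limits -> clearable.
ok = price_is_clearable("XYZ", 10.015,
                        {"XYZ": (10.00, 10.03)},
                        [("buy", 10.02), ("sell", 10.01)])
```

Expressive Bidding can only tighten these per-leg limits further, which is what makes the brokers' 15c3-5 checks straightforward.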

> What market data do you publish about your order book?

We report fills to the TRF to satisfy regulatory reporting requirements, but we're entirely dark otherwise.


Thanks for the info!


Question: If I just want to buy one stock, would I expect a better price from you or from a normal venue (both empirically/in reality on the one hand, and theoretically on the other)?


Anything involving the phrase "better price" is a regulatory sensitivity, so please forgive the specificity of this answer. It's possible but not guaranteed that both the buyer and seller can get a better price for micro and macro reasons. On the micro front, we're a uniform clearing price auction with prices out to the sixth decimal place (NB: this is not sub-penny pricing—we only accept orders with prices in increments of $0.01). That makes it possible for buyers and sellers to split the spread, with auction dynamics dictating the split. Also, Expressive Bidding allows market makers to provide liquidity on a hedged basis, potentially incentivizing them to post more aggressive sizes and prices.
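A toy example of the spread-splitting arithmetic, with hypothetical numbers and a made-up `buyer_share` parameter (the source only says auction dynamics dictate the split): orders arrive at penny increments, but the uniform clearing price can land between them out to six decimals.

```python
# Toy spread split: bid/ask are in $0.01 increments, but the clearing price
# can sit anywhere in between, rounded to six decimal places.
# The buyer_share knob is a stand-in for whatever the auction dynamics decide.

def split_spread(bid_px, ask_px, buyer_share=0.5):
    """Return a clearing price between bid and ask; buyer_share in [0, 1]
    is the fraction of the spread conceded to the buyer's side."""
    assert bid_px <= ask_px
    return round(bid_px + buyer_share * (ask_px - bid_px), 6)

px_even = split_spread(10.00, 10.01)                      # -> 10.005
px_tilted = split_spread(10.00, 10.01, buyer_share=0.25)  # -> 10.0025
```

Orders themselves are still only accepted at $0.01 increments; the sub-penny precision applies only to the clearing price.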

On the macro front, we're focused on finding Pareto (more accurately Kaldor–Hicks) efficient outcomes missed by simpler auction formats. Imagine that a buyer expects to move the market 5bps while executing, and a seller expects the same. If agency issues prevent them from finding each other outside of OneChronos (perhaps they only have 2bps of statistical expectation) and they trade on OneChronos at the bid/mid/offer, the reader can decide if that constitutes getting a better price than other markets or not. Multiunit auctions make market impact/execution risk known pre-trade and reduce uncertainty for counterparties, potentially reducing the cost of liquidity.


I don't understand what you do but congratulations on not dying despite the apparent difficulties that have occurred thus far. Many would have given up.


Thanks! We're quite fortunate in that we always knew roughly where we needed to get to from a technical, business, and regulatory standpoint, so actually getting there was just a long hike on a mostly known path, one step at a time.


Gents, congrats on your launch from a friend in NYC! Have loved hearing about this despite understanding way less than you do about the markets, their problems, and your product. Glad to see you charging forward!


Thanks! It means a lot knowing people are out there rooting for us :)


Thank you!


Why is a pre-2016 company only launching in 2022?


Hey Kelly, Steve. (Wale here). Congrats on the launch! This was a great read!


Thanks, and good to see you here! :)


I will consider myself an accomplished person the day I'm able to understand this post :)

No offence, but tbh, when I read through this, I felt a deja vu of coming across another Theranos. Some super innovative-sounding complex tech which ultimately turns out to be a total dud.


We're simply standing on the shoulders of giants for this one. Paul Milgrom and Bob Wilson, winners of the 2020 Prize in Economic Sciences [1], are responsible for the underlying theory and commercialization in other industries. My advisor, Preston McAfee, introduced me to the concept (Milgrom's book was the auction theory text) and was largely responsible for bringing mechanism design to ad markets. Lots of folks are applying machine learning to accelerating discrete optimization problems, and an advisor of ours, Kevin Leyton-Brown [2], pioneered applying it to combinatorial auctions for wireless spectrum repacking.

Our value add is mainly a team that understands these fields and the extreme nuance of capital markets. That's allowed us to generate novel IP and a purpose-built solution.

[1] https://www.nobelprize.org/prizes/economic-sciences/2020/sum...

[2] https://arxiv.org/abs/1706.03304


Watch the video on their website. It will help you to understand. How useful it will prove to be is yet to be seen.

