CPS-???? | Mutable shared state and atomicity are compatible #874
Conversation
@klntsky from my level this seems like it has material in common with Intents / Validation Zones and the current discussion behind the latter CIP (please correct me if I'm wrong): #862
@fallen-icarus maybe you could look at this from the other direction & let us know if this CPS relates to indefinite / incompletely specified transactions as per your concurrent posting: #873
I've added it for introduction at tomorrow's regular CIP meeting (where we can brainstorm a more concise title): https://hackmd.io/@cip-editors/94
I'm sorry if I sound harsh, but I don't believe this CPS should be discussed.
I'm not sure I can think of a use case where non-determinism is required in principle. Current designs that fall back to such "reinvented non-determinism" are the result of a previously immature ecosystem (Plutus V1 plus a lack of specific knowledge about eUTxO) having to "borrow" non-deterministic designs from other ecosystems. But in principle, non-determinism is NOT a requirement. Yes, determinism might require us to think a bit harder because there were no established patterns, but as the ecosystem evolves we are coming up with more and more eUTxO-friendly deterministic patterns. And in doing so we are preserving a silver bullet in terms of the security of the protocols running on this deterministic design: security that other ecosystems cannot achieve as easily.

Again, this is mostly true only because the "established" designs so far tend to borrow designs NOT INTENDED for UTxO, paying the price of applying a synchronous model where we should instead be thinking in a parallel model. Many examples of contracts able to handle concurrent users exist; the most naive I can think of is an NFT marketplace contract.
I have to agree with @michele-nuzzi (although I'm still open to discussing it).
> I'm not sure I can think of a use case where non-determinism is required in principle.
I've built a DEX, lending/borrowing protocol, options trading protocol, and aftermarket protocol. This is a full DeFi stack, and not once did I wish for non-determinism.
I think all of the problems you've mentioned are actually due to most DApp developers still not knowing how to properly use the eUTxO model. This isn't something that can be figured out in a year or two. I literally just opened a CIP that shows the eUTxO model actually enables securely breaking apart transactions; doing so would trivially enable babel fees and high-frequency off-chain trading. I didn't realize it was possible until only recently. Perhaps this CIP can actually help solve your batcher problems?

Every DApp developer I've seen argue "we need non-determinism" is also doing something that I think could be better done another way. Is the problem that we don't have non-determinism? Or is the problem that they are misusing the eUTxO model? My experience and understanding make me lean very heavily towards the latter.
Personally, I wouldn't consider sacrificing any determinism for another few years. I really don't think it has been given a fair chance yet. If we can figure it out, determinism has way better security guarantees than non-determinism.
CPS-XXXX/README.md (outdated)

> However, it comes at a cost:
>
> - Development becomes more complicated because of the need to "re-invent" non-determinism where it is required for the dApp logic
> - UTxO contention limits the number of concurrent users a dApp can have
This is not a universal truth. It is only true for concentrated DApps (e.g., liquidity-pool-based DApps), which are not taking advantage of the eUTxO model. Each seller can have their own UTxO, which makes concurrency as high as the underlying market (trying to increase it beyond this can lead to economic instability).
I haven't touched this CIP in a while, but distributed DApps don't have this problem. Distributed DApps may have a lower throughput per transaction, but since these transactions can be submitted in parallel, they can easily have a higher throughput per block than concentrated DApps. (I'm not a software engineer so there is likely still a lot of room for improvement with distributed DApp throughput.)
> Each seller can have their own UTxO which makes concurrency as high as the underlying market
The problem is that each buyer will try to use the order with the best available price. No matter how many UTxOs you create in an order book, it will be a probabilistic game without batchers, but it doesn't have to be in principle. Parallelizing only sidesteps the problem rather than solving it, which is not really viable: EVM-based AMMs can simply provide better guarantees, offering immediate execution no matter the number of users. Aiming at anything less than that means losing, because the quality standards are not being set by you.
CPS-XXXX/README.md (outdated)

> ## Use cases
>
> ### Automated market maker DEX
I am against AMM DEXs as the dominant DEX architecture:
- They directly undermine the security assumptions of Proof-of-Stake since users are forced to give up most, if not all, delegation control, voting control, and custody of their assets
- They do not allow users to specify their own prices which makes them extremely economically inefficient
- They are fundamentally centralized since updating/maintaining the liquidity pools is controlled by nothing more than a multisig; governance actions are not trustlessly enforced
Using AMMs as the foundation of a DeFi economy (especially one secured and governed by PoS) is a really bad idea. I honestly struggle to even call this an opinion, given how strongly I believe it.

The problem is not the lack of immediate execution. We actually don't need immediate execution; we only need "fast enough" execution, which future improvements to Cardano are likely to provide. IMO the main reason Cardano's DeFi has not really taken off is that most DApps are still using liquidity pools for everything, and a significant number of people do not like the downsides I mentioned above. They are still waiting on the sidelines.
(Apologies, but this topic really triggers me 😅)
> They directly undermine the security assumptions of Proof-of-Stake since users are forced to give up most, if not all, delegation control, voting control, and custody of their assets

It's possible to use one's own staking credentials with the dApp's script as the payment credential.

> They do not allow users to specify their own prices which makes them extremely economically inefficient

Existing AMM DEXes allow placing orders as a side feature. An order book existing alongside an AMM is how Uniswap does that.

> They are fundamentally centralized since updating/maintaining the liquidity pools is controlled by nothing more than a multisig; governance actions are not trustlessly enforced

Any governance scheme can be attached to any protocol. Liquidity pools without an update mechanism are possible, as is an order book that lets the admins do something with the orders.
Anyway, an AMM DEX is just an example here. An order book suffers from the same problem: an order can be matched by only one counter-party.

That CIP does not address UTxO contention, because even though transactions can be assembled piecewise independently, the UTxOs they ultimately have to consume can each be spent by only a single transaction. There can only be as many swaps as there are UTxOs, while mutable shared state would allow as many simultaneous swaps as the settlement layer allows.

The ledger changes since Plutus V1 have not made batchers any less necessary. Do you have a counter-example?

Type-II non-determinism is in the product requirements of the core DeFi primitives: AMM DEXes, lending/borrowing, liquidations, etc. These are the cases where users shouldn't, and can't possibly, know the outcome of their action within a dApp.

What particular aspects of security do you have in mind? I can show an example of how determinism affects security in quite a catastrophic way (this will be the topic of my next CPS). The DAO hack mentioned in the eUTxO paper does not count; it's obvious that mutable shared state was not the culprit, but rather their particular API design choices around it.

The fact that the ledger design is good enough for something does not mean that it is good enough for everything in that category. Your argument can be rephrased as "immediate execution in the presence of dApp-layer non-determinism is not a valid use case", right?
- introduce an excerpt from the paper
- add a note about collateral loss to the AMM DEX example
@klntsky I've retitled the PR based on the words I recalled people using in the debate of this issue in the last hour's meeting. It's vital that we keep CIP titles not only concise (the original title would have been the longest) but free from bias, especially since the confining effect on Cardano DeFi has not been objectively established yet.

As I said at the meeting, I'm happy to see this discussion as a counterpoint to the proposed design patterns that would offer transaction flexibility:

... without sacrificing Cardano's "unique selling proposition" of determinism. I would recommend promoting this to a CIP candidate if & when a use case is documented that cannot be satisfied by a fully deterministic design pattern.
CPS-XXXX/README.md

> ## Problem
>
> It is impossible to build a dApp that has these three properties at the same time on a fully-deterministic ledger:
I think the key to scaling solutions on Cardano without compromising its security and determinism lies in building layered solutions. Bridges and other scaling tech can help us batch transactions, reduce load on the main chain, and improve overall performance without giving up on the ledger determinism. In my view this is not a problem, but a feature of Cardano, and we need to find ways and adapt our solutions with that goal.
cc @lehins
In general I agree that our ecosystem would benefit greatly from some global state mechanism; after all, the whole original pitch of Cardano was to be a hybrid ledger (this is why we already have global state in the form of our staking / rewards system). I just think the approach suggested in this CPS is wrong. Instead, we should simply expand the existing account system (reward accounts) by introducing accumulators, which provide the ability to store and manipulate data-encoded types. We can then extend the interface we already have for interacting with the account portion of our ledger (the certificates for registering, deregistering, and withdrawing from reward accounts) with new certificate types: one that registers an accumulator with its initial state and a governing script, and one that submits actions to be processed against the accumulator by that script.

I think the above covers everything this CPS seeks to achieve without introducing additional complexity to the UTxO portion of the ledger (we isolate this non-deterministic computation in the already non-deterministic account portion of the ledger). The detail is in the implementation: in the case of Cardano, that state is explicitly passed around from one transaction to another, while Ethereum actually performs the global mutation.
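For concreteness, here is a minimal sketch of what the proposed interface could look like. Everything in it is hypothetical: the certificate names (`RegisterAccumulator`, `AccumulatorAction`), the `Integer`-valued state, and the `applyCert` function are illustrations of the idea, not cardano-ledger types.

```haskell
-- Hypothetical sketch of the proposed accumulator extension to the account
-- layer; none of these names exist in cardano-ledger.
module AccumulatorSketch where

import Data.Map (Map)
import qualified Data.Map as Map

-- An accumulator is identified by a credential, like a reward account.
newtype AccumulatorId = AccumulatorId String
  deriving (Eq, Ord, Show)

-- Data-encoded state plus the script that governs state transitions.
data Accumulator = Accumulator
  { accState  :: Integer                        -- simplified on-ledger state
  , accScript :: Integer -> Integer -> Maybe Integer
    -- ^ given the current state and an action, return the new state or reject
  }

-- Two new certificate kinds, mirroring the existing certificates for
-- registering, deregistering, and withdrawing from reward accounts.
data AccumCert
  = RegisterAccumulator AccumulatorId Accumulator  -- create with initial state
  | AccumulatorAction   AccumulatorId Integer      -- submit an action (redeemer)

-- Certificates act only on the account portion of the ledger, so the
-- non-determinism stays isolated from the UTxO set.
applyCert :: AccumCert
          -> Map AccumulatorId Accumulator
          -> Either String (Map AccumulatorId Accumulator)
applyCert (RegisterAccumulator aid acc) accs
  | aid `Map.member` accs = Left "accumulator already registered"
  | otherwise             = Right (Map.insert aid acc accs)
applyCert (AccumulatorAction aid action) accs =
  case Map.lookup aid accs of
    Nothing  -> Left "unknown accumulator"
    Just acc ->
      case accScript acc (accState acc) action of
        Nothing -> Left "accumulator script rejected the action"
        Just s' -> Right (Map.insert aid acc { accState = s' } accs)
```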
One powerful property is that it enables a large number of expensive operations to be done entirely off-chain. For instance, linear search should never be performed in on-chain code under any circumstances, because it completely throws away what is perhaps the biggest advantage that Cardano's smart contract platform has over those in other ecosystems, namely the deterministic script evaluation property. We made huge design sacrifices to obtain this property, so not taking advantage of it would frankly be like running a marathon with ankle weights.

By taking advantage of deterministic script evaluation, we never have to perform search on-chain: because we know what the list looks like at the time of transaction construction, we can just pass in the index where the item we are looking for should be and fail if it is not there. We can look up anything in O(1) checks/boolean conditions by providing the on-chain code with the index (via the redeemer) of where the element is supposed to be, and simply erroring if the indexed element is not indeed what we are looking for. The fact that any element can be found on-chain without linear search is an extremely powerful property of our smart contract platform that simply doesn't exist outside our ecosystem.

This doesn't just apply to search. For example, one demonstration of the strength of this is that for any problem in NP (Nondeterministic Polynomial Time), we can solve it in P on-chain by calculating the solution off-chain, providing it via the redeemer, and then simply verifying its correctness on-chain. This is only possible because of the deterministic script evaluation property, which guarantees that all inputs to a script are known at transaction construction. It is not possible in systems with non-deterministic script evaluation, because the solution provided during transaction construction may be invalidated by the time the transaction is processed in a block.

Another obvious benefit is the application of zero-knowledge proofs, especially zk scaling: for instance, we could transform any smart contract into one that merely verifies, on-chain, a succinct proof of its execution computed off-chain.
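To illustrate the "verify, don't search" pattern in isolation, here is a small sketch in plain Haskell (not PlutusTx; the function names and list-based "ledger" are purely illustrative). Off-chain code does the search and passes the index as the redeemer; the on-chain side only verifies the claim:

```haskell
-- Sketch of the "verify, don't search" pattern: O(n) work happens off-chain,
-- the on-chain check is a single comparison at the claimed index.
module VerifyDontSearch where

import Data.List (elemIndex)

-- Off-chain: find the element and put its index into the "redeemer".
buildRedeemer :: Eq a => a -> [a] -> Maybe Int
buildRedeemer = elemIndex

-- "On-chain": no search, just check that the indexed element is the one
-- we claimed, and fail otherwise. (With an indexed structure instead of a
-- list, this check is O(1).)
validate :: Eq a => a -> [a] -> Int -> Bool
validate target xs ix =
  case drop ix xs of
    (x : _) -> x == target
    []      -> False

main :: IO ()
main = do
  let orders = ["order1", "order2", "order3", "order4"] :: [String]
  case buildRedeemer "order3" orders of
    Nothing -> putStrLn "not found off-chain; don't build the transaction"
    Just ix -> print (validate "order3" orders ix)  -- True
```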
Yes, in fact we already have a form of mutable shared state: see global-state (although this global state will only be meaningful to Plutus scripts once Conway II is released, which allows for staking scripts that require script execution for registration; in Conway I the execution of the associated staking script is optional for registration).

In the model proposed above, it would be handled the same way that we currently handle mutable shared state (reward account registration / deregistration).
What about how this impacts Ouroboros Leios? According to the original paper, it depends on the assumption that the majority of transactions are independent of each other.
It also requires that checking for conflicts be possible without having to run the actual smart contracts; Section 3.2 of the paper concludes with this point.

While technically the rewards are a form of global state, they do not create any dependencies between transactions, since the only possible thing you can do is withdraw the balance. Currently, smart contracts can't even check the available balance. AFAIU, the changes being proposed would make it possible for a large number of transactions to become dependent on each other, in the sense that the order of execution now matters more and requires running the smart contracts to determine whether the specified order is actually valid. In other words, it would result in the assumptions underlying Leios being consistently violated.
@fallen-icarus This isn't quite correct. In addition to withdrawals, registration and deregistration also act on global state, and unlike withdrawals they can indeed create dependencies between transactions. Consider, for instance, an even/odd example where a credential is registered and deregistered by alternating transactions, so that its registration status encodes the parity of the transactions processed so far, and a DEX branches on that parity.

The success or failure of each registration certificate then depends on the other transactions in the block. Sure, smart contract execution isn't needed to determine validity, but smart contract execution itself isn't the issue that Leios is concerned with; the important part is whether transactions conflict with, and therefore depend on, one another.
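A toy model of that dependency, under my reading of the example (the parity-bit mechanics here are my reconstruction, not code from the thread): a credential's registration status acts as one bit of global state, and whether a certificate validates depends on the certificates that preceded it.

```haskell
-- Toy model of the dependency that registration certificates create.
-- One credential = one bit of global state; each transaction flips it,
-- so validity depends on transaction order within the block.
module EvenOddBit where

type Registered = Bool

data Cert = Register | Deregister deriving Show

-- Registering an already-registered credential fails, and vice versa.
applyCert :: Cert -> Registered -> Either String Registered
applyCert Register   False = Right True
applyCert Register   True  = Left "already registered"
applyCert Deregister True  = Right False
applyCert Deregister False = Left "not registered"

-- Apply a block's certificates in order; reordering changes the outcome.
runBlock :: [Cert] -> Either String Registered
runBlock = foldl (\acc c -> acc >>= applyCert c) (Right False)

main :: IO ()
main = do
  print (runBlock [Register, Deregister, Register])  -- Right True (odd count)
  print (runBlock [Register, Register])              -- Left "already registered"
```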
@colll78 The even or odd example seems too contrived to be a real use case. Why would a DEX care whether the total number of transactions in a block is even or odd? Are there any DApps that actually care about this? Still, perhaps my wording could have been better, since I concede it is technically possible to create dependencies using certificates right now. However, I don't think my conclusion is wrong, since doing so doesn't seem useful (isn't the lack of usefulness the point of this CPS?). Even if there are some use cases for it, I think the overwhelming majority of transactions will not bother using certificates like this, due to it being useless in most contexts. So IMO it is still safe, right now, to assume that most transactions will not depend on each other.
You definitely know way more about software engineering than I do, but AFAIU I am skeptical that restricting the ex-units is enough. Currently, isn't it enough to just check whether the input still exists, which is O(1)? There is no need to do anything else aside from looking up the input in the UTxO set (I am assuming this is a map). But if you now need to check the accumulator smart contracts, you also need to deserialize the smart contracts before you can even run them. Wasn't this the problem with the reference script DDoS you thwarted (by de-registering the staking credentials)? I think someone from the consensus team should weigh in on this, but I'm not sure who to tag.

I don't mean to come across as universally against mutable state; I just want to make sure including it on L1 doesn't sacrifice anything meaningful. For you, the pull to Cardano may have been the promise that it would be a "hybrid ledger", but I actually don't know why you think that was promised. The eUTxO paper quoted in this CPS literally says it "forgoes any notion of shared mutable state" due to the difficulties it creates when trying to scale the blockchain. One of the main reasons I came to Cardano was precisely because I thought there would be zero global state, due to 1) the scaling issues and 2) my belief that it is not necessary for layer 1. Even TradFi keeps its global state on layer 2 (i.e., cash and coins are layer 1 and the banking system is layer 2).
I agree the example I gave above is indeed contrived, purposely so, since it is learning material I created to illustrate how the design pattern works. There are indeed practical applications of this design pattern: a single credential is essentially global state over a single bit. The above example manages only a single bit, but you can extend it to manage multiple account credentials, which together form a sequence of bits of global state that can be used for more complicated examples.

For instance, typically the maximum number of signers on a multi-sig in a single block is limited by the constraints of a single transaction. With this design pattern, you can create a multi-sig contract that is parameterized by a set of credentials and succeeds only if all of them are registered. Each credential script is parameterized by its own set of signers, and registration succeeds if and only if a majority of its signers' signatures are present. You then have a very large multi-sig that can check for a majority signature of a large set of participants in a single block. You can similarly extend the odd/even example (modeling it as a bit sequence) to keep track of the total number of orders that were placed in a block, which you can then use, for instance, to adjust a lending rate or protocol fees based on utilization; a sketch of this encoding follows below. Also keep in mind that this isn't exactly "shared mutable state"; instead, that state is explicitly passed around from one transaction to another.

As for the concerns regarding scripts during phase-1 validation: O(1) just means a constant upper bound on time complexity. Both executing a Plutus script and looking up an input in the UTxO set are O(1), since the running time of Plutus scripts is constrained by ex-units; the difference is the size of the constants. We already have a small, efficient scripting language with constants low enough to fit in phase-1 validation: native scripts! Deserialization and serialization are done to improve storage efficiency and preserve network bandwidth. If we impose very strict constraints on the size and ex-units of these accumulator scripts, I don't see why they would need a compact representation; the storage cost already takes size in bytes into consideration, so we can just store them directly in flat UPLC encoding.
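As a sketch of the bit-sequence idea (my own toy encoding, not on-chain code): treat n credentials as an n-bit counter whose increment is implemented by toggling registration statuses, then decode it to drive something like a utilization-based rate.

```haskell
-- Toy model of the bit-sequence extension: n credentials form an n-bit
-- counter of orders seen in a block; each order toggles registrations.
module BitCounter where

-- Counter state: registration bits, least significant first.
type Bits = [Bool]

-- Increment by rippling a carry; each toggle is a (de)registration.
increment :: Bits -> Bits
increment []         = []
increment (b : rest) = not b : if b then increment rest else rest

-- Decode the counter, e.g. to adjust a lending rate based on utilization.
toInt :: Bits -> Int
toInt = foldr (\b acc -> fromEnum b + 2 * acc) 0

main :: IO ()
main = do
  let start = replicate 3 False            -- all credentials deregistered
      after = iterate increment start !! 5 -- five orders placed this block
  print (toInt after)                      -- 5
```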
Unless you are the stake pool operator creating the next block, you don't have control over which transactions will be added to the block. It seems very likely for some transactions to be added to the block while others are omitted, which means the multisig will likely fail even when it should succeed. Furthermore, because you don't control which transactions are added, this approach is susceptible to a man-in-the-middle attack where the stake pool deliberately omits certain transactions from the block to control the outcome of the multisig. Unless I am misunderstanding something, I don't see how this approach is practical at all.

Besides, you can already get around the constraints of a single transaction by using native assets instead of registration statuses. For example, you can group signatures into sub-transactions and mint a symbolic "Yes" token if the sub-transaction's multisig succeeds (the minting would be governed by a separate script to ensure all sub-transactions can mint the same "Yes" token). Then a top-level transaction spends each of the UTxOs holding the "Yes" tokens from the sub-transactions and burns the tokens. If at least the minimum required number of "Yes" tokens is burned, the overall multisig succeeds. This approach is very similar to the registration approach except that no global state is required at all, and the sub-transactions and top-level transaction can appear in separate blocks. It is also easily scalable, since native assets are small enough that tens of thousands of "Yes" tokens can easily fit in a single transaction (the tokens can be consolidated into a single UTxO before executing the top-level transaction). A sketch of the scheme follows below.
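A toy model of the "Yes"-token scheme just described (the quorum logic is my paraphrase of the comment, with illustrative names): each sub-group mints a token on a majority signature, and the top-level check counts the burned tokens.

```haskell
-- Toy model of the "Yes"-token aggregation: sub-transactions mint one token
-- each when their local multi-sig passes; the top-level transaction burns
-- them and checks the quorum. No global state involved.
module YesTokens where

-- A sub-group passes when a majority of its members signed.
subGroupPasses :: [Bool] -> Bool
subGroupPasses sigs = 2 * length (filter id sigs) > length sigs

-- Each passing sub-transaction mints exactly one "Yes" token.
mintedTokens :: [[Bool]] -> Int
mintedTokens = length . filter subGroupPasses

-- Top-level transaction: burn the collected tokens and check the quorum.
topLevelPasses :: Int -> [[Bool]] -> Bool
topLevelPasses quorum groups = mintedTokens groups >= quorum

main :: IO ()
main = do
  let groups = [ [True, True, False]    -- majority: mints a "Yes" token
               , [True, False, False]   -- no majority: mints nothing
               , [True, True, True] ]   -- majority: mints a "Yes" token
  print (topLevelPasses 2 groups)       -- True: 2 tokens meet the quorum of 2
```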
From an economic perspective, I think it is a really bad idea to base prices on the total number of orders processed in a previous block. Prices need to be based on the current ratio of supply and demand. If a block had a sudden burst of 100 orders, under this approach prices should increase. But what if there are only 5 orders remaining to be filled (i.e., there is very little demand left)? That is actually the time when prices should decrease, since supply now dwarfs demand; you would be incentivizing fewer orders at exactly the time the DApp wants more. As another example, say a liquidity source gets drained by 90% by one order in the previous block. It was only one order, so under this approach prices should decrease. But only 10% of the liquidity remains, which means prices should actually increase to disincentivize new orders and incentivize new liquidity. The total number of orders processed isn't relevant for determining prices. And again, since you don't control which transactions actually go into the block, this use case can be gamed by stake pool operators (e.g., deliberately minimizing the number of orders per block to keep prices down). I don't see how this avoids economic instability and market distortions, so I don't consider this a practical use case either.

EDIT: It is worth pointing out that oracles can be used if a DApp really does care about the number of orders processed in a previous block. So in both use cases mentioned, the same niche can already be satisfied without any kind of global state.
Full determinism of transactions, defined by the absence of shared mutable state, gives rise to valuable ledger properties, such as the guarantee that all inputs to a script are known at transaction construction time.

However, it comes at a cost:

- Development becomes more complicated because of the need to "re-invent" non-determinism where it is required for the dApp logic
- UTxO contention limits the number of concurrent users a dApp can have

Completely avoiding mutable shared state may be a suboptimal way to achieve the desirable properties, as the design space for alternatives has not been fully explored.
(Rendered version)