Introducing Zeitgeist’s “Rikiddo Scoring Rule”

Introduction

Automated Market Makers (AMMs) make trading easier for users because they remove the need for a counterparty. Instead of trading peer-to-peer, AMMs trade peer-to-contract (i.e. a trader executes their trade not against another trader, but against an algorithm implemented as a smart contract).

AMMs continuously offer both buy and sell functionality at a price determined by a programmed cost function. Although legacy AMMs have been based primarily on market scoring rules (MSRs), adaptations in crypto such as constant-function market makers have become more widely implemented.

At Zeitgeist, we have thoroughly researched numerous AMM models to find one that suits our needs. We have been revisiting MSR-based AMMs, in particular the Logarithmic Market Scoring Rule (LMSR) invented by Robin Hanson through his research on prediction markets and economics. Since Hanson’s original proposal, several authors have published variations of his formula, and there is now a plethora of material available.

As Researcher in Economics at Zeitgeist, I have combed through the relevant literature and, together with our amazing team, compared LMSR with more contemporary variations that are popular in the crypto space. Our work has resulted in a new variant of LMSR that we propose in this paper, the “Liquidity Sensitive Dynamic LMSR”. We’ve called this scoring rule the “Rikiddo Scoring Rule” after the Japanese word “rikiddo”, which means “liquid”.

Throughout this paper, we will explore the different variants of market scoring rules, weighing their pros and cons in the context of Zeitgeist’s protocol, and then present our own model, the Rikiddo Scoring Rule, along with its main characteristics and the arguments for why we believe it is the most fitting scoring rule for our ecosystem. As we will see, the main advantage of this rule is its ability to dynamically adapt its fee using endogenous variables.

LMSR: Back to Basics


When setting up a prediction market, a market maker needs to make two fundamental decisions about that market:

  • The choice of the contract type.
  • The market mechanism.

The market mechanism is the focus of this specific work, so we'll save the contract type for another blog post.

The market mechanism defines how the trades of the market participants are conducted. Robin Hanson (2007) [1] proposes a Market Scoring Rule, or a "sequentially shared scoring rule", that requires market participants to offer successive predictions. This rule acts like an Automated Market Maker, continuously offering to buy and sell contracts at the price determined by a price function.

Among the several scoring rules used in prediction markets, the Logarithmic Market Scoring Rule (LMSR) is the most popular. This scoring rule financially incentivizes market participants to give truthful opinions about future events.

The LMSR’s cost and price functions are functions of the net amount of shares of each outcome sold so far. Given a vector q holding the number of outstanding shares for each outcome, the cost function is:

\[ C(q) = b\ln(\sum_{i=1}^{n} e^\tfrac{q_{i}}{b}) \]

If we take the partial derivative of C(q) with respect to each q_i, we obtain the price function for each asset. The b parameter can be described as the “liquidity parameter”, and is set manually before the market opens. Authors like Lekwijit and Sutivong [2] showed that this parameter influences not only the liquidity but also the maximum loss that the market maker will incur and the price adaptability.

For example: a smaller b represents low market liquidity. This implies that the market price will fluctuate considerably, and we’ll have a more adaptive market. Consequently, this translates into a smaller maximum loss for the market maker. Why?

The market maker's worst-case loss happens when traders are very certain of the outcome and eventually drive the market probability of the realized outcome to 1. If the market starts with a uniform probability distribution over M outcomes, the market maker's worst-case loss is \( b\ln M \). So, if the market maker wants to operate the market at minimal loss, they have to set the value of b as small as possible.

Think of it as follows:

  • At all times, the market signals its true value.
  • These signals are sent to traders, who will interpret them and carry out an action (buy or sell) related to this signal.
  • If market signals are clear and traders act accordingly, the true market outcome becomes known and the market probability of the realized outcome converges to 1.

With a highly liquid market (high b), it would take fewer steps for the market price to reach its real value, since the successive variations in quantities occur in a gradual and controlled manner. In a market with low liquidity (low b), changes in quantities occur more abruptly, producing a longer oscillation around the real value and providing less certainty about the true outcome. The more volatile the market is, the less certainty traders have about the probability of the outcome being realized.

It is thus safe to say that, for our purposes, a parameter with so much influence over market behavior should respond to the state of the market itself, rather than being an externally fixed value that depends on someone interpreting the market's signals.
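To make this concrete, here is a minimal Python sketch (our own illustration, not a reference implementation) of the LMSR cost function, the resulting price function (the softmax \( e^{q_i/b} / \sum_j e^{q_j/b} \)), and the \( b\ln M \) worst-case-loss bound. The quantities and the two b values are arbitrary, chosen only to show how a smaller b moves prices more sharply while bounding the market maker's loss more tightly:

```python
import math

def lmsr_cost(q, b):
    """LMSR cost function: C(q) = b * ln(sum_i exp(q_i / b))."""
    return b * math.log(sum(math.exp(qi / b) for qi in q))

def lmsr_price(q, b, i):
    """Price of outcome i: the partial derivative of C(q) with respect to q_i,
    i.e. a softmax over q / b."""
    denom = sum(math.exp(qj / b) for qj in q)
    return math.exp(q[i] / b) / denom

# Two-outcome market where 10 shares of outcome 0 have been sold and none of outcome 1.
q = [10.0, 0.0]

for b in (5.0, 50.0):
    p0 = lmsr_price(q, b, 0)
    worst_case_loss = b * math.log(len(q))   # b * ln(M), starting from a uniform prior
    print(f"b={b:5.1f}  price of outcome 0 = {p0:.3f}  worst-case loss = {worst_case_loss:.2f}")
```

With b = 5 the same ten shares push the price of outcome 0 to roughly 0.88, while with b = 50 it barely moves past 0.55; the smaller b, however, caps the worst-case loss at 5·ln 2 ≈ 3.5 instead of 50·ln 2 ≈ 34.7.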

LS-LMSR: A New Hope

An interesting approach to the liquidity parameter is the one taken by Othman et al. [3], who show that pricing rules cannot be path-independent, arbitrage-free, and liquidity-sensitive all at the same time.

Figure 1: Representation of the characteristics of an MSR. The three requirements represent a trilemma, and we can only have two of the three.

Following that observation, Othman et al. propose a liquidity parameter that depends on an alpha coefficient and the size of the market (the sum of the quantities of all outcome shares in the market).

\[ C(q) = b(q)\ln(\sum_{i=1}^{n} e^\tfrac{q_{i}}{b(q)}) \]

\[b(q)=\alpha\sum_{i=1}^{n} q_{i} \]

Here 𝛼 is a fixed parameter. However, the authors point out an intrinsic problem: this mechanism adapts very slowly to the underlying value if a jump happens after a sizeable number of transactions have already occurred.
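As a quick illustration of the liquidity sensitivity, here is a minimal sketch of the LS-LMSR cost function with b(q) = 𝛼·Σq_i. The 𝛼 value and the two pool states are invented for this example; the point is simply that the same 5-share purchase has a far smaller price impact once the market has grown, because b(q) grows with it:

```python
import math

def ls_lmsr_cost(q, alpha):
    """LS-LMSR cost function with a liquidity parameter that scales with
    market size: b(q) = alpha * sum_i q_i."""
    b = alpha * sum(q)
    return b * math.log(sum(math.exp(qi / b) for qi in q))

alpha = 0.03                    # fixed design parameter, purely illustrative
shallow = [10.0, 10.0]          # young market:  b(q) = 0.6
deep    = [1000.0, 1000.0]      # mature market: b(q) = 60.0

for q in (shallow, deep):
    cost_before = ls_lmsr_cost(q, alpha)
    q_after = [q[0] + 5.0, q[1]]                      # buy 5 shares of outcome 0
    trade_cost = ls_lmsr_cost(q_after, alpha) - cost_before
    print(f"b(q) = {alpha * sum(q):5.1f}  cost of the trade = {trade_cost:.3f}")
```

In the young market the purchase costs roughly 4.6 (the price of the bought outcome is pushed close to 1), while in the mature market it costs about 2.7, close to fair value plus a small commission.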

Consider also: the main cost of this model, according to the authors, is seeding the funds for the initial state. At the same time, market operators do not use any external information or past price information to arrive at a decision. Furthermore, the previous model was criticized for having a fixed parameter, and while this model makes the liquidity parameter vary with market size, it is still multiplied by another fixed parameter\(^1\).

Under this rule we would still face the problem of being unable to capture the market's ever-changing signals and adapt to them, and we would perpetually have to change the parameters manually. This could be an option for projects that work with a reduced number of markets simultaneously. This is why it is necessary to continue the search for a rule that allows these adjustments to happen automatically\(^2\) and dynamically, adapting to these variations on its own.

The Dynamic Market Maker

Thus far we’ve looked at models that establish their fee statically: an operator perceives a signal that the market has changed in some way, and adjusts the AMM's behavior (its parameters) accordingly. This is inefficient for several reasons, the two main ones being:

  1. The changes of the parameters are discrete. They change only if the algorithm can perceive a significant change in the market structure.
  2. They propose an ex-post adjustment of the market maker. This means that the parameters will only update after a change (price pump, information shock, etc.) happens, so the adjustments are for future outcomes, but they won’t catch the initial shock.

With those reasons in mind, we turn to the Dynamic Market Maker (DMM) proposed by Nguyen, Luu, and Ng (2021) [4]: a new method for adding dynamic fees to the market maker. First, they calculate the correlation between the prices of the two assets in the pool, which indicates how similar they are to each other. Based on this correlation, a list of parameters is provided (taking into account that a lower correlation means higher discrepancies), one of which is an initial fee for the asset pair. Additionally, a dynamic fee is calculated, which depends on variable factors such as price or volume. The authors use a ratio between the volume over a short period and the volume over a longer period, which makes sense given that it is on-chain data.

With this model, the fee varies continuously. The crux is that it discourages trades in periods of high volatility (when traders tend to profit from price discrepancies) by increasing the fee (and consequently the bid/ask spread), and encourages trades in periods of low volatility by decreasing the fee.
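The snippet below is not the DMM formula itself, only a minimal sketch of the underlying idea with invented numbers: a base fee is scaled by the ratio of short-window to long-window volume, clamped to an arbitrary band, so that bursts of activity raise the fee and quiet periods lower it.

```python
def dynamic_fee(base_fee, short_window_volume, long_window_volume):
    """Illustrative dynamic fee: scale a base fee by the ratio of recent volume
    to longer-term volume, so high recent activity (ratio > 1) raises the fee
    and quiet periods (ratio < 1) lower it. The clamping band is arbitrary."""
    ratio = short_window_volume / max(long_window_volume, 1e-12)
    return base_fee * min(max(ratio, 0.5), 2.0)   # keep the fee within [0.5x, 2x] of the base

base = 0.003  # 0.3% base fee, purely illustrative
print(dynamic_fee(base, short_window_volume=180.0, long_window_volume=100.0))  # volatile period: fee rises
print(dynamic_fee(base, short_window_volume=60.0,  long_window_volume=100.0))  # quiet period: fee falls
```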

Checkpoint

We've now explored three models, each with strong characteristics that we'd like to see in an MSR:

  • A strong and reliable structure (LMSR).
  • The ability to adapt to liquidity changes (LS-LMSR).
  • The ability to adjust the rate to be able to incentivize or discourage trades according to the market conditions (DMM).

Based on these three characteristics, we have developed our own MSR: The Liquidity Sensitive Dynamic LMSR, or “Rikiddo”.

The Rikiddo Scoring Rule: The Force Awakens

Before we define our model, we need to understand liquidity, volatility, and their relation to each other. When we talk about an increase in a pool's liquidity, we directly relate it to an increase in the amount of assets held within it. We've observed in the cited papers that the amount of assets in a pool is generally treated as a stock variable, while the flows that make it vary are not considered.

So what do we mean? Let’s illustrate: if we have a pool of water that we want to fill, we usually only take into account the water level it contains at a certain moment, not the flow rate at which it is being filled, nor the possibility of leaks that cause water loss. Continuing with the analogy, leaks tend to increase as the filling flow and the volume of the pool increase, so the market maker should reduce the possibility of leaks by imposing higher fees on the flows that change the level of the pool (inflows and outflows).

Thus, in a more technical sense, volatility is related to the flows, while liquidity is the stock that results from their difference.

Another thing to keep in mind is the possibility for traders to take advantage of flow irregularities (arbitrage opportunities). This generates profits for more experienced traders, but causes losses for liquidity providers. That’s why it is important to have a market scoring rule that regulates the flows.

Now, let’s continue with our model…

This model starts from the LS-LMSR with the following cost function:

\[ C(q) = b(q)\ln(\sum_{i=1}^{n} e^\tfrac{q_{i}}{b(q)}) \]

\[ b(q)=\alpha\sum_{i=1}^{n} q_{i} \]

As we’ve noted before, this model incorporates an 𝛼 value that can be interpreted as a commission taken by the market maker: the larger the 𝛼, the bigger the commission. Keeping this parameter fixed implies that shocks in prices or changes in structure (for example, newly disclosed information that alters market perception of a particular asset and increases its traded volume) will not be reflected by the model until after the shock has occurred, at which point the market maker has to adjust its parameters manually. For this reason, we propose a parameter composed of two factors:

\[ \alpha = f + \eta(r) \]

Where f is an initial fixed fee, and η is a function representing the variable component of the fee, which depends on a parameter r. Let’s look at these two in more detail…

These values follow the properties of the dynamic market maker: f depends on the pre-existing relationship between the assets that will be placed in the pool (the higher the correlation, the lower this fee), and its structure is similar to that of the 𝛼 originally proposed for the LS-LMSR:

\[ f= \frac{vig}{n \log(n)} \]

Where n is the number of assets inside the pool, and vig (vigorish) is a coefficient that changes according to the correlation between assets, representing the fee charged by a bookmaker for accepting a gambler's wager. For different numbers of assets and correlation levels, we’ve compiled a table of vig values that worked for us\(^3\):


Number of assets | Low correlation | Mid correlation | High correlation
---------------- | --------------- | --------------- | ----------------
2                | 0.04            | 0.02            | 0.012
3                | 0.09            | 0.047           | 0.028
4                | 0.16            | 0.08            | 0.048
5                | 0.23            | 0.116           | 0.069
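As a worked example, the sketch below computes f from the table above; we assume the natural logarithm for \( \log(n) \), since the formula does not specify a base:

```python
import math

# vig values from the table above, indexed by (number of assets, correlation level)
VIG = {
    (2, "low"): 0.04,  (2, "mid"): 0.02,   (2, "high"): 0.012,
    (3, "low"): 0.09,  (3, "mid"): 0.047,  (3, "high"): 0.028,
    (4, "low"): 0.16,  (4, "mid"): 0.08,   (4, "high"): 0.048,
    (5, "low"): 0.23,  (5, "mid"): 0.116,  (5, "high"): 0.069,
}

def fixed_fee(n_assets, correlation_level):
    """f = vig / (n * log(n)), with the natural logarithm assumed."""
    vig = VIG[(n_assets, correlation_level)]
    return vig / (n_assets * math.log(n_assets))

print(fixed_fee(2, "low"))   # ~0.0289
print(fixed_fee(5, "high"))  # ~0.0086
```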

On the other hand, η is a function of r, where r is the ratio between the behavior of the volume over an interval of t periods and over a longer interval of t+k periods. In our case we are going to use the Exponential Moving Average, though a Weighted Moving Average (or Simple Moving Average) could be an option too.

\[ r= \frac{EMA_{t}}{EMA_{t+k}} \]

\[ k>0 \]

Also, η(r) satisfies the following condition:

\[ \frac{\partial \eta}{\partial r}\geqslant 0 \]

The function can take any form that meets these preconditions. The important thing about the choice of function is that it regulates the behavior of the dynamic fee according to how the ratio between the moving averages varies. We will use a sigmoid function:

\[ \eta(r) = \frac{0.01r}{\sqrt{6+r^{2}}} \]

This implies that η has a more pronounced slope for small values of r than for larger ones. The intention is to create incentives to reduce the variation of volume at early stages, so we can keep it within a tighter bound\(^4\). Also, the initial values can be either positive (if r>0) or negative (if r<0), meaning that the total fee can be smaller than f. This value conditions the initial fee to create incentives to trade in periods of low volatility, and to discourage trading in periods of high volatility.
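To see how the dynamic component could behave, here is a sketch that computes r from two exponential moving averages of volume and feeds it into the sigmoid above. The EMA smoothing factor (2/(period+1)), the window lengths, and the volume series are all assumptions made for this illustration, not values fixed by the model:

```python
import math

def ema(values, period):
    """Exponential moving average with smoothing factor 2 / (period + 1)."""
    k = 2.0 / (period + 1.0)
    avg = values[0]
    for v in values[1:]:
        avg = k * v + (1.0 - k) * avg
    return avg

def eta(r):
    """Sigmoid-shaped dynamic fee component: 0.01 * r / sqrt(6 + r^2)."""
    return 0.01 * r / math.sqrt(6.0 + r * r)

# Hypothetical per-period traded volumes; the recent spike raises the short EMA.
volumes = [100, 110, 95, 240, 260, 250, 255, 270]
short, long = 3, 6                       # t and t + k periods, with k > 0
r = ema(volumes[-short:], short) / ema(volumes[-long:], long)

f = 0.0289                               # example fixed fee (2 assets, low correlation)
print(f"r = {r:.3f}, eta(r) = {eta(r):.5f}, alpha = f + eta(r) = {f + eta(r):.5f}")
```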

Given that the fee can be lower than f, it is necessary to introduce another new concept: the minimal revenue coefficient (ω), given by:

\[  ω= \beta f \]

\[  0<\beta \leq 1 \]

\[  \omega \leq \alpha \]

This value is a proportion of f and ensures that the profit of the liquidity provider remains positive. Having said that, we can re-express the b parameter as:

\[ b=\max (\alpha, \omega )\sum_{i=1}^{n} q_{i} \]

So, the cost function is given by:

\[ C(q)=\max (\alpha, \omega )\sum _{i=1}^{n}q_{i}\ln \sum_{i=1}^{n}(e^{\frac{q_{i}}{\max (\alpha, \omega )\sum_{i=1}^{n} q_{i}}}) \]

The difference between the cost function of the new state (q+Δq) and the previous one (q) represents the cost of the transaction.
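Putting the pieces together, the sketch below evaluates the Rikiddo cost function and prices a trade as the difference between the new and old cost-function values. The β, the example fee f, the r values, and the quantities are illustrative assumptions, not production parameters:

```python
import math

def eta(r):
    """Sigmoid dynamic fee component: 0.01 * r / sqrt(6 + r^2)."""
    return 0.01 * r / math.sqrt(6.0 + r * r)

def rikiddo_coefficient(f, r, beta=0.5):
    """Effective fee coefficient max(alpha, omega), with alpha = f + eta(r)
    and the minimal revenue coefficient omega = beta * f."""
    return max(f + eta(r), beta * f)

def rikiddo_cost(q, f, r, beta=0.5):
    """C(q) = b(q) * ln(sum_i exp(q_i / b(q))), with b(q) = max(alpha, omega) * sum(q)."""
    b = rikiddo_coefficient(f, r, beta) * sum(q)
    return b * math.log(sum(math.exp(qi / b) for qi in q))

def trade_cost(q, dq, f, r, beta=0.5):
    """Amount charged for a trade: C(q + dq) - C(q)."""
    q_after = [qi + di for qi, di in zip(q, dq)]
    return rikiddo_cost(q_after, f, r, beta) - rikiddo_cost(q, f, r, beta)

f = 0.0289                                # example fixed fee (2 assets, low correlation)
q = [100.0, 100.0]                        # outstanding shares per outcome

print(rikiddo_coefficient(f, r=1.0))      # calm market: alpha slightly above f
print(rikiddo_coefficient(f, r=5.0))      # volume spike: eta(r) pushes the fee up
print(trade_cost(q, [10.0, 0.0], f, r=1.0))  # cost of buying 10 shares of outcome 0
```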

We have our model at this point, but we need to add an important feature: the AMM's behavior when the market is running out of liquidity. In this case, we significantly raise the fee so as to discourage any possible trade related to that asset. This raised coefficient is an ad-hoc value for now.

Having defined all these parameters, we can finally say that none of them are exogenous, and thus we can derive our values from on-chain data. We consider this a substantial improvement, since the parameters and limits of the model can now be analyzed and validated dynamically with data from each specific case. This does not change the fact that, initially, it is necessary to impose a baseline value, which can later be adjusted according to the needs of each specific market.

Conclusion

This paper is based on my research into the numerous versions of market scoring rules. Having identified, outlined, and tested these varying iterations, we have been able to develop a scoring rule of our own, one that we have called “The Rikiddo Scoring Rule”.

In summary of what we have outlined above, our newly developed Rikiddo Scoring Rule has three primary characteristics:

1) Its rate is dynamically adjusted, allowing the market to encourage or discourage trades as necessary;

2) It is not necessary to modify the parameters after sudden market signals, as is the case with rules where the parameters are fixed (the exponential moving average is responsible for this regulation);

3) All the parameters are endogenous, and therefore their determination and validation will depend on on-chain data only.

Although at Zeitgeist we believe that this Market Scoring Rule is already significant progress, we will continue to innovate in order to obtain optimal functions that adapt dynamically to recurring circumstances that present themselves within our markets.

Future lines of work

For the near future, we have established some lines of work on which we will continue to focus in order to improve this newly developed Rikiddo Scoring Rule:

  • Applying relative volume fees instead of absolute volume fees: Currently, our dynamic fee is calculated on the total volume within the liquidity pool, but the ideal situation would be to consider the variation in the individual volume of each asset, which would lead to different dynamic fees per asset. This improvement brings with it the computational cost of performing additional calculations.
  • Using Rikiddo in combinatorial markets: For the current work, we’ve used the example of a binary prediction market, but a great utility of prediction markets lies in combinatorics, where predictions are defined over a combinatorial space.

We hope this paper has been helpful in understanding the development and implementation of our scoring rule, and we welcome any feedback you may have via email at “hi@zeitgeist.pm” - we would love to hear from you.

Thanks for reading and for your interest in our work at Zeitgeist in building an effective prediction market protocol.

Footnotes

\(^1\) We are aware that the authors say that 𝛼 is the parameter responsible for establishing the “vig” value, but we need to establish some criteria behind that value.

\(^2\) By 'automatically' we refer to the ability of parameters to vary without manual modification.

\(^3\)  These values could change over time.

\(^4\) The main idea is to reach liquidity in a controlled way, avoiding high volatility.

Bibliography and complementary articles

[1] Hanson, R. (2007). Logarithmic markets scoring rules for modular combinatorial information aggregation.  http://mason.gmu.edu/~rhanson/mktscore.pdf

[2] Lekwijit, S., & Sutivong, D. (2018). Optimizing the liquidity parameter of logarithmic market scoring rules prediction markets. Journal of Modelling in Management.

[3] Othman, A., Pennock, D. M., Reeves, D. M., & Sandholm, T. (2013). A practical liquidity-sensitive automated market maker. http://www.eecs.harvard.edu/cs286r/courses/fall12/papers/OPRS10.pdf

[4] Nguyen, A., Luu, L., Ng, M. (2021). Dynamic Automated Market Maker. https://files.kyber.network/DMM-Feb21.pdf

Brahma, A., Chakraborty, M., Das, S., Lavoie, A., & Magdon-Ismail, M. (2012, June). A Bayesian market maker. http://www.eecs.harvard.edu/cs286r/courses/fall12/papers/bmm-ec.pdf

Othman, A., & Sandholm, T. (2012, June). Profit-charging market makers with bounded loss, vanishing bid/ask spreads, and unlimited market depth. https://www.cs.cmu.edu/~sandholm/profitChargingMarketMaker.ec12.pdf

Wang, Y. (2020). Automated market makers for decentralized finance (defi). https://arxiv.org/abs/2009.01676