An LLM’s Comparison of VecViz to Established Vol Models

Comparison can aid definition.

In other blog posts we have discussed the approach of VecViz and its Vector Model to price probability. Here we seek to further illuminate that approach through qualitative comparison to a few well-established volatility model archetypes that enjoy significant institutional money manager use.

Specifically, we seek to compare the Vector Model to popular deterministic and stochastic vol (i.e., volatility) models. Deterministic vol models view volatility as constant across price ranges and time horizons. For deterministic vol, we use “Sigma”, the normal-distribution-oriented constant volatility input to the Black-Scholes option pricing formula, depicted in the left half of the picture above. Stochastic vol models view volatility as something that is itself variable. This dynamic is evident in how option implied volatility varies across the term-to-expiration and moneyness continuums, as illustrated in the right half of the picture above. For stochastic vol we use “Heston” and “Deep Learning of Rough Volatility”.
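To make the deterministic archetype concrete, here is a minimal sketch of the Black-Scholes call formula with a single constant “Sigma” input. All parameter values are illustrative and are not drawn from VecViz or any of the models above:

```python
# A minimal sketch of Black-Scholes pricing with a constant "Sigma" input.
# All parameter values below are illustrative.
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S: float, K: float, T: float, r: float, sigma: float) -> float:
    """European call price under geometric Brownian motion with constant sigma."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# The same sigma is used for every strike and expiry -- the defining
# simplification of the deterministic-vol archetype.
price = bs_call(S=100.0, K=100.0, T=1.0, r=0.05, sigma=0.20)  # ~10.45
```

Because one sigma applies to every strike and expiry, this archetype cannot, by construction, reproduce the implied volatility surface that stochastic vol models target.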

We ourselves cannot make an unbiased comparison, so we asked an independent large language model (LLM) to attempt one.

We attempted this comparison with a few LLMs. Given the paucity of information about VecViz and the Vector Model on the internet at present, the LLMs’ responses regarding them varied considerably. We present what we thought was the best LLM response in the table below, with attribution in the caption.

| Criteria | Sigma | Heston | Deep Learning of Rough Volatility | VecViz’s Vector Model |
| --- | --- | --- | --- | --- |
| Inputs | Historical price data; volatility estimates | Historical price data; volatility estimates | Historical price data; volatility estimates; neural network architecture | Historical price data; vector representations of asset prices |
| Key Assumptions | Geometric Brownian motion for asset prices; constant volatility | Geometric Brownian motion for asset prices; stochastic volatility (Heston process) | Fractional Brownian motion for asset prices; rough volatility (power-law decay of volatility) | Vector autoregressive model for asset prices; no specific volatility assumptions |
| Parameters Requiring Estimation | Volatility parameter (constant) | Mean reversion speed; volatility of volatility; correlation between asset price and volatility | Neural network architecture; hyperparameters for training the network (e.g., learning rate, layers) | Vector autoregressive parameters; vector representations of asset prices |
| Implementation Steps | 1. Estimate historical volatility. 2. Use it to price options. | 1. Simulate asset prices using Heston dynamics. 2. Price options. | 1. Train neural networks to approximate rough volatility. 2. Use the networks to price options. | 1. Estimate vector representations of asset prices. 2. Use them to model option prices. |
| Use Cases | Option pricing; risk management; portfolio optimization | Option pricing; risk management; volatility modeling | Option pricing; risk management; volatility modeling | Option pricing; risk management; time series modeling |
| Pro’s | Simplicity; closed-form solutions for some cases; widely used in finance | Captures volatility smile; more realistic than constant volatility models; closed-form solutions | Fast computation; accurate approximation; scalable to large datasets | Flexible representation; no explicit volatility assumptions; captures dependencies between assets |
| Con’s | Assumes constant volatility; doesn’t capture volatility clustering; no jumps | Computationally expensive; complex parameter calibration; no jumps | Requires large training datasets; neural network complexity; interpretability challenges | Requires specialized techniques for vector representation; may not generalize well to all assets |
The table above presents the response of OpenAI’s ChatGPT, via Microsoft Bing Co-Pilot as of early May 2024, to the prompt: “compare sigma, heston, rough volatility, deep learning of rough volatility, and VecViz’s vector model in a table that includes criteria for: inputs, assumptions, parameters requiring estimation, implementation steps, use cases, pros, cons”. Note that the response for “rough volatility” was dropped for formatting purposes, and strikethroughs were added to elements of the response we believe to be incorrect. We did not supplement any responses we believe to be incomplete.
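The Heston column’s two implementation steps (“simulate asset prices using Heston dynamics, then price options”) can be sketched in a few lines of Monte Carlo. The Euler discretisation with full truncation and all parameter values below are illustrative assumptions on our part, not anything calibrated or drawn from VecViz:

```python
# A hedged sketch of the Heston "simulate, then price" steps from the table.
# Euler discretisation with full truncation; all parameters are illustrative.
import math
import random

def heston_call_mc(S0, K, T, r, v0, kappa, theta, xi, rho,
                   n_paths=5000, n_steps=50, seed=7):
    """Monte Carlo price of a European call under Heston dynamics."""
    rng = random.Random(seed)
    dt = T / n_steps
    payoff_sum = 0.0
    for _ in range(n_paths):
        S, v = S0, v0
        for _ in range(n_steps):
            z1 = rng.gauss(0.0, 1.0)
            z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
            vp = max(v, 0.0)  # "full truncation": floor the variance at zero
            # Step 1 of the table: simulate the asset price...
            S *= math.exp((r - 0.5 * vp) * dt + math.sqrt(vp * dt) * z1)
            # ...and its mean-reverting stochastic variance
            v += kappa * (theta - vp) * dt + xi * math.sqrt(vp * dt) * z2
        payoff_sum += max(S - K, 0.0)
    # Step 2 of the table: discount the average payoff to price the option
    return math.exp(-r * T) * payoff_sum / n_paths

# Illustrative, uncalibrated inputs: v0 = theta = 0.04 is roughly 20% vol
price = heston_call_mc(S0=100.0, K=100.0, T=1.0, r=0.05,
                       v0=0.04, kappa=2.0, theta=0.04, xi=0.3, rho=-0.7)
```

With these inputs the price should land in the neighborhood of the comparable constant-volatility Black-Scholes value, which illustrates the “computationally expensive” and “complex parameter calibration” cons the LLM lists: five Heston parameters must be set before a single simulated price is produced.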

We agree with many aspects of the LLM’s response, including:

  1. The Vector Model makes no explicit volatility assumptions.
  2. The Vector Model does require specialized techniques, though they are visually transparent to a considerable extent.
  3. The Vector Model is not characterized as “simple”, “closed form” or “widely used”, though those terms are used to describe the other model archetypes.

Still, we would make several adjustments, including:

  1. We would include definitions of many of the terms of art the LLM included in the table. We attempt such definitions at the end of this post.
  2. We see several pro’s and con’s attributed to other models that we would also attribute to the Vector Model, including:
    • More realistic than constant volatility models (Pro’s)
    • Captures volatility smile (Pro’s)
    • Computationally expensive (Con’s)
    • Requires large training datasets (Con’s)
  3. Another “pro” we would add to the Vector Model is that neither the Vector Model nor option fair values based on it require option market data, whereas stochastic volatility models typically do, for calibration purposes.
    • The Vector Model thus has an arguably different objective: it is trying to determine fair value based on its proprietary estimate of the ticker’s forward probability distribution, whereas stochastic volatility models are often used just to mimic the option market’s behavior for a ticker.
    • Considerable user judgement is required to set the parameters of a stochastic vol model such as Heston or Deep Learning of Rough Volatility to “fair” levels so that in aggregate they produce a good estimate of an option’s fair value. In contrast, VecViz’s Vector Model makes all those judgements for the user and presents only the estimated fair value for the option.
  4. The final “pro” we would add is the cognition-friendly VecViz framework that the Vector Model is derived from.
  5. We would add the lack of external validation of VecViz’s back-testing results as a “con” (remedying this is a priority).
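On the calibration point in item 3: fitting a stochastic vol model to the option market usually begins by backing out an implied volatility from each quoted option price. A minimal sketch of that first step follows; the “market” quote is an illustrative number we chose, not real data:

```python
# A minimal sketch of the first step of option-market calibration: invert
# Black-Scholes to recover the implied volatility of a quoted price.
# The market quote below is illustrative, not real data.
from math import log, sqrt, exp, erf

def bs_call(S, K, T, r, sigma):
    """European call price under Black-Scholes with constant sigma."""
    N = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

def implied_vol(market_price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    """Bisection on sigma: the call price is monotone increasing in vol."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < market_price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# An illustrative quote of 10.45 for an at-the-money 1-year call implies
# a volatility close to 20%.
iv = implied_vol(10.45, S=100.0, K=100.0, T=1.0, r=0.05)
```

A stochastic vol model is then typically calibrated so its prices match a whole surface of such quotes, which is exactly the option-market dependence the Vector Model avoids.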

Appendix: brief definitions of technical terms used in the LLM table

  1. “Geometric Brownian Motion” refers to the assumption that price returns are random and compound, causing bigger absolute price moves when the price level is higher.
  2. “Fractional Brownian Motion” refers to the assumption that price returns are not completely random – they exhibit some degree of momentum or mean reversion. 
  3. “Vector Representations” are lines connecting tops and bottoms and in the area spanning to another top or bottom, representing net vector balance trajectories, extrapolated forward to the model date. These lines are weighted according to their Vector Strength.
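The contrast between the first two definitions can be made quantitative. For fractional Brownian motion with Hurst exponent H, successive increments have lag-1 autocorrelation 2^(2H−1) − 1: zero when H = 0.5 (ordinary Brownian increments, as in geometric Brownian motion), negative when H < 0.5 (mean reversion, the regime rough volatility models typically use for volatility itself), and positive when H > 0.5 (momentum). A small sketch, using only this standard formula:

```python
# Lag-1 autocorrelation of fractional Gaussian noise (the increments of
# fractional Brownian motion) as a function of the Hurst exponent H.
def lag1_autocorr(H: float) -> float:
    """2**(2H - 1) - 1: zero at H = 0.5, negative below, positive above."""
    return 2 ** (2 * H - 1) - 1

# H = 0.1: strongly negative (mean reversion, "rough" regime)
# H = 0.5: zero (independent increments, ordinary Brownian motion)
# H = 0.9: strongly positive (momentum)
results = {H: lag1_autocorr(H) for H in (0.1, 0.5, 0.9)}
```

This is why H < 0.5 is described as “rough”: negatively correlated increments make the path jag back and forth more than an ordinary Brownian path does.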
