
Use of 0.02 default risk-free rate can surprise users #594

Open
Beliavsky opened this issue Apr 28, 2024 · 1 comment
Labels
bug Something isn't working

Comments


Beliavsky commented Apr 28, 2024

The output of

import numpy as np
import pandas as pd
from pypfopt.efficient_frontier import EfficientFrontier

np.set_printoptions(precision=3)

# Two assets: expected returns [1, 4] and diagonal covariance matrix diag(1, 4)
n = 2
mu = (np.arange(n) + 1)**2
print("mu:", mu)
cov = np.zeros(shape=[n, n])
for i in range(n):
    cov[i, i] = (1 + i)**2
print("cov:\n", cov, sep="")

# Maximize the Sharpe ratio and extract the optimal weights as a numpy array
ef = EfficientFrontier(mu, cov)
weights = pd.Series(ef.max_sharpe()).to_numpy()
print("weights:", weights)

is

mu: [1 4]
cov:
[[1. 0.]
 [0. 4.]]
weights: [0.496 0.504]

When I compute the tangency portfolio directly with NumPy and normalize the weights so that their absolute values sum to 1, I get weights of

[0.5, 0.5]

Since the calculations are done in double precision, I am surprised that the apparent round-off error from pypfopt is this large.
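
For reference, here is a minimal NumPy sketch of the direct calculation I mean (assuming a zero risk-free rate, so the tangency weights are proportional to inv(cov) @ mu, normalized so their absolute values sum to 1):

import numpy as np

mu = np.array([1.0, 4.0])
cov = np.diag([1.0, 4.0])

# Tangency portfolio with a zero risk-free rate: weights proportional to inv(cov) @ mu
raw = np.linalg.solve(cov, mu)      # [1.0, 1.0]
weights = raw / np.abs(raw).sum()   # normalize so |weights| sum to 1
print(weights)                      # [0.5 0.5]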

Thanks for the project.

Beliavsky added the bug (Something isn't working) label on Apr 28, 2024
Beliavsky (Author) commented

Now I see that max_sharpe uses a default risk-free rate of 0.02. If I use

max_sharpe(risk_free_rate=0.0)

the problem goes away. I think the package should use a default risk-free rate of 0.0. Users could either set the risk-free rate themselves or pass expected excess returns.
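
For what it's worth, the weights printed above match the same direct calculation applied to excess returns with the 0.02 default (this is just an illustration of where the numbers come from, not pypfopt's internal solver):

import numpy as np

mu = np.array([1.0, 4.0])
cov = np.diag([1.0, 4.0])
rf = 0.02  # pypfopt's default risk_free_rate

# Tangency weights on excess returns: proportional to inv(cov) @ (mu - rf)
raw = np.linalg.solve(cov, mu - rf)   # [0.98, 0.995]
weights = raw / np.abs(raw).sum()
print(weights.round(3))               # [0.496 0.504], matching the output above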

Beliavsky changed the title from "Large round-off error" to "Use of 0.02 default risk-free rate can surprise users" on Apr 28, 2024