My official betting thread

NFL:

Taking some dome games now:

STL -1 @ 62%
TB +3.5 @ 60%
ARZ +6 @ 60%

HOU -9.5 @ 55%
 


NYK -5.5 @ 55%


Brett, this is completely off topic but you might be able to help me with something, so I thought I'd ask.

In horse racing in HK, Bill Benter talked about how he would combine his estimated probability with the market's estimated probability to come up with a final estimate. He'd do this by taking the log of each probability, multiplying each log by a weighting, summing them, and then exponentiating the result to turn it back into a probability. So, in a fictitious three-horse race as follows, where the first probability listed is the punter's and the second is the market's, it would go like this:

Probs (punter, market):
A: 0.4, 0.6
B: 0.25, 0.1
C: 0.35, 0.3

Logs (base 10):
A: -0.398, -0.222
B: -0.602, -1.0
C: -0.456, -0.523

Let's assume a weighting of 0.7 for the punter and 1.4 for the market, as arbitrary figures:

A: -0.279, -0.311
B: -0.421, -1.4
C: -0.319, -0.732
Which, when you sum them and raise 10 back to that power (since these are base-10 logs), returns probs of:
A: 0.258
B: 0.015
C: 0.089
These sum to about 0.36 rather than 1.0, so adjusted back to a 100% market:
A: 0.712
B: 0.042
C: 0.246
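Here's that whole calculation as a quick Python sketch, for anyone who wants to replay it (my own illustration, not Benter's code; I've kept base-10 logs to match the figures above):

```python
import math

# The punter's and the market's probability estimates for each horse
punter = {"A": 0.40, "B": 0.25, "C": 0.35}
market = {"A": 0.60, "B": 0.10, "C": 0.30}

W_PUNTER, W_MARKET = 0.7, 1.4  # the arbitrary weightings from above

# Weighted sum of base-10 logs, then 10**x to undo the log
raw = {h: 10 ** (W_PUNTER * math.log10(punter[h]) + W_MARKET * math.log10(market[h]))
       for h in punter}

# Adjust back to a 100% market
total = sum(raw.values())
combined = {h: round(p / total, 3) for h, p in raw.items()}
print(combined)  # {'A': 0.712, 'B': 0.042, 'C': 0.246}
```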
That's all fine, but my question is about the weightings used. The higher the weighting, the greater the impact of that model on the final probability. He said the best way to work out what the weightings should be was to "maximise the log likelihood" of each model. That's the bit that's stumping me. What on earth does it mean to "maximise the log likelihood" of a model?
I'd appreciate any advice on this from others too, or any pointers on where I might read up to work this one out.

Thanks
 
What on earth does it mean to "maximise the log likelihood" of a model?

It's optimization of the coefficients ('weightings' = coefficients) using Maximum Likelihood Estimation (MLE).

For optimization using MLE, you need a decent sample set to train on (like any optimization). I would do this in SPSS, which is the most accessible option for non-programmers, although SAS and R would do the same thing if you have experience programming in their languages.


http://www.weibull.com/hotwire/issue33/relbasics33.htm <--- Probably the easiest article to understand on MLE
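To make it concrete: "maximising the log likelihood" means choosing the weightings that make the winners you actually observed look as probable as possible under the combined model. A minimal sketch of that, assuming an invented three-race history and scipy's general-purpose optimiser (none of the data or names here are Benter's):

```python
import numpy as np
from scipy.optimize import minimize

# Toy history: (punter probs, market probs, index of the actual winner).
# All figures invented purely for illustration.
races = [
    (np.array([0.40, 0.25, 0.35]), np.array([0.60, 0.10, 0.30]), 0),
    (np.array([0.50, 0.30, 0.20]), np.array([0.45, 0.35, 0.20]), 1),
    (np.array([0.20, 0.45, 0.35]), np.array([0.25, 0.40, 0.35]), 1),
]

def neg_log_likelihood(weights):
    """Negative sum over races of log(combined prob given to the winner).
    Minimising this is the same as maximising the log likelihood."""
    a, b = weights
    total = 0.0
    for punter, market, winner in races:
        raw = np.exp(a * np.log(punter) + b * np.log(market))
        probs = raw / raw.sum()   # adjust back to a 100% market
        total += np.log(probs[winner])
    return -total

result = minimize(neg_log_likelihood, x0=[1.0, 1.0], method="Nelder-Mead")
print(result.x)  # the fitted (punter, market) weightings
```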
 
Bill Benter talked about how he would combine his estimated probability with the market's estimated probability, to come up with a final estimate

All he's doing is dampening down his own perceived edges by incorporating what the market is doing.
 
All he's doing is dampening down his own perceived edges by incorporating what the market is doing.

Yes, that's true, but it means you can bet a higher Kelly fraction and have fewer spikes and troughs, as a result of your market being more indicative.
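(For anyone following along: the Kelly fraction for a simple win bet is f = (p*d - 1)/(d - 1), where p is your estimated win probability and d the decimal odds on offer. A quick sketch with invented figures:)

```python
def kelly_fraction(p, decimal_odds):
    """Full-Kelly stake as a fraction of bankroll for a straight win bet.
    p: estimated win probability; decimal_odds: the price on offer."""
    b = decimal_odds - 1          # net winnings per unit staked
    return (b * p - (1 - p)) / b  # classic (bp - q) / b

# Invented example: combined estimate of 0.52 against a price of 2.10
print(kelly_fraction(0.52, 2.10))  # ~0.084 of bankroll
```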

Thanks for the link etc and the suggested programs, I'll explore all that, appreciate it.
 
In horse racing in HK, Bill Benter talked about how he would combine his estimated probability with the market's estimated probability to come up with a final estimate.

Why would you want to bring your own assessment in to coincide with the market?

My method:
1. Find +EV spot.
2. Bet up to maximum stake allowed or up to my subjective/arbitrary pain threshold
 


NCAAB:

Dayton -16.5 @ 57%
Georgia Southern +17 @ 55%
Texas State +16 @ 55%
Canisius +20.5 @ 55%
Nebraska +13.5 @ 55%
East Carolina +16.5 @ 55%
 
What is the reasoning?

That two separately devised models, i.e. the market and the punter, can be combined to create a model whose predictive power is greater than either model individually. He used a type of r2 test to measure the strength of each method, and found that the r2 figure was at its highest when the two were combined.
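One common version of that kind of test is a pseudo-r2 in McFadden's style, 1 - LL(model)/LL(null), where the null model gives every runner an equal chance. I can't say this is exactly Benter's measure, but it shows the idea (figures invented):

```python
import numpy as np

def mcfadden_r2(winner_probs, field_sizes):
    """Pseudo-R^2 = 1 - LL(model) / LL(null). winner_probs: the probability
    the model gave each actual winner; field_sizes: runners in those races."""
    ll_model = np.sum(np.log(winner_probs))
    ll_null = np.sum(np.log(1.0 / np.asarray(field_sizes)))
    return 1.0 - ll_model / ll_null

# Invented: probabilities a model assigned to the winners of three races
print(mcfadden_r2([0.71, 0.33, 0.41], [3, 8, 6]))  # ~0.53
```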
 
NFL Week 15:

Already posted:

PHI +4.5 @ 55%
STL -1 @ 62%
TB +3.5 @ 60%
ARZ +6 @ 60%
HOU -9.5 @ 55%

The rest:

CHI +3 @ 60%
JAC +8 @ 55%
CAR +3 @ 55%
KC +3 @ 57%
NE -4 @ 62%
 
Hi Brett, have you used SPSS? I am having a read through their brochure and it certainly sounds like a very strong modelling tool.


Yeah, I used it throughout my course. It's fantastic for statistical insights, but not so good for model creation. It's not flexible enough to maintain a model in.

You can't go past hard code connected to an easily modifiable database. I mainly do everything in Excel with some VBA thrown in; it's not as flexible as proper code, but I don't have flash programming skills.
 
Yeah, I used it throughout my course. It's fantastic for statistical insights, but not so good for model creation. It's not flexible enough to maintain a model in.

You can't go past hard code connected to an easily modifiable database. I mainly do everything in Excel with some VBA thrown in; it's not as flexible as proper code, but I don't have flash programming skills.

I'm similar with Excel, use it a lot with a bit of VBA thrown in. As far as maximising the log likelihood goes, couldn't you just use Excel's Solver to adjust the coefficients, with the highest r2 value as the goal?
 
Hmmm, yeah, that would work. I've done similar before, but I try to minimise the mean or squared line error.
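For what it's worth, the squared-error objective is just as easy to wire up as the log-likelihood one; here's a Brier-style version (names and figures mine):

```python
import numpy as np

def brier_score(probs, outcomes):
    """Mean squared error between predicted probabilities and what actually
    happened (1 = won, 0 = lost). Lower is better, so Solver would minimise."""
    probs, outcomes = np.asarray(probs), np.asarray(outcomes, dtype=float)
    return np.mean((probs - outcomes) ** 2)

# Invented: three predictions against their actual results
print(brier_score([0.71, 0.33, 0.41], [1, 0, 1]))  # ~0.18
```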

Goddamn NFL. I keep on winning the reverse lotto.

-7 on lost fumbles just this week. Despite the talking heads on TV saying that players have 'butter fingers' or 'have just lost confidence carrying', fumbles are almost completely random (about 90%). It's really unlucky to get a -20 fumble differential in a betting season, and I think I've smashed that.

-3 lost fumbles on the NE play alone. Teams that go -3 on lost fumbles do 14% ATS.
 