
Predicting Success in the Lottery with Deep Learning

Deep learning is a particular field of machine learning driven by abstract representations of reality.


Its structure is close to that of the well-known neural networks: the idea is to mimic the human brain, which is known to be very efficient at learning. A large number of layers with nonlinear processing between them is used: the deeper the network, the more complex the structures it can capture. The first machine learning algorithms appeared in the 1950s, and their development is clearly related to improvements in computational power.

About Deep Learning

Deep learning has proven its ability to solve many different problems, from handwriting and speech recognition to computer vision. The structure of the algorithms is based on a reproduction of the human brain, which is known to be the most powerful learning engine. It is able to capture the latent structure in any dataset as a human being could, and the results seem somehow magical to someone who is not familiar with this class of algorithms. The main purpose of this paper is to test its limits. After its great success at Go, the next step is simply to test whether deep learning is able to deal with randomness. It looks feasible because God does not play dice: Go is indeed a pure combinatorial problem and may merely be reduced to a computational and optimization task. Randomness is conceptually more interesting and cannot be reduced to a few dimensions: a higher-dimensional model is required.

How Lotto Works

Lotto is a famous and widespread game involving randomness. The origins of the first lotteries are not clearly established, but the Han dynasty used them to finance the construction of the Great Wall of China. The lottery principle is very simple: people buy a ticket that corresponds to a combination bet over a general set of numbers. A draw then takes place at a fixed date and time. The winnings depend on how well the combination matches the draw, and the jackpot is won if the combination matches exactly.
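As a minimal illustration of this principle (with made-up numbers, not real draws), counting matches is just a set intersection:

```python
# Minimal sketch of the lottery principle: winnings depend on how many
# numbers on the ticket match the official draw. Numbers are made up.
ticket = {2, 7, 19, 33, 46, 58}   # the combination bet by a player
draw = {2, 14, 19, 46, 60, 71}    # the official draw

matches = len(ticket & draw)      # set intersection = matching numbers
print(f"{matches} matching numbers out of {len(ticket)}")
# the jackpot corresponds to matches == len(ticket)
```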

Predicting Lotto numbers is a supervised task: the collected data, in the present case the past draws, are used as inputs. The model is a neural network whose parameters are tuned to the data during the training phase. Training neural networks is often difficult due to vanishing or exploding gradients; this is the main problem with these algorithms. At each pass over the data, the parameters are optimized and, after convergence, the validation set is used to compute the validation error.
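The article does not publish its preprocessing code, but to make the supervised setup concrete, here is a sketch, under my own assumptions about the data layout, of how each past draw can be encoded as a target vector over the 75 possible numbers:

```python
import numpy as np

N_NUMBERS = 75  # number of possible lotto numbers

def encode_draw(draw):
    """Encode one draw, e.g. [2, 9, 13, 14, 46, 63], as a multi-hot target:
    1.0 at each drawn number, 0.0 elsewhere."""
    target = np.zeros(N_NUMBERS, dtype=np.float32)
    for n in draw:
        target[n - 1] = 1.0  # lotto numbers are 1-indexed
    return target

# Hypothetical history of past draws -> one target row per draw
past_draws = [[2, 9, 13, 14, 46, 63], [1, 7, 21, 33, 46, 75]]
targets = np.stack([encode_draw(d) for d in past_draws])
print(targets.shape)  # (n_draws, 75)
```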

The Model and Its Representation

The features retained are, firstly, at each draw time, the quarterly GDP, the quarterly unemployment rate, the American president (Obama or not), the day, the month and the year. To this, I added the number of times each number was drawn over all past draws and the cross-presence matrix, defined as the number of times every pair of numbers appeared together. The per-number counts and the cross-presence matrix were set to zero for the first draw and then incremented at each step, as sketched below. The neural network implemented is represented in the figure below. I treated the cross-presence matrix separately from the other inputs: I applied convolutional layers to the cross-presence matrix and then, using residual learning, added the intermediate result to the output of the convolutional layers.
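The feature-construction code is not given in the article; a minimal sketch of the incremental counts and cross-presence matrix it describes, assuming the draws are available in chronological order, could look like this:

```python
import numpy as np

N_NUMBERS = 75

def build_history_features(past_draws):
    """For each draw, record the feature state *before* that draw:
    per-number counts and the cross-presence (co-occurrence) matrix,
    both starting at zero and incremented after every draw."""
    counts = np.zeros(N_NUMBERS, dtype=np.float32)
    cross = np.zeros((N_NUMBERS, N_NUMBERS), dtype=np.float32)
    count_feats, cross_feats = [], []
    for draw in past_draws:
        count_feats.append(counts.copy())  # all zeros before the first draw
        cross_feats.append(cross.copy())
        for n in draw:
            counts[n - 1] += 1.0           # how often each number was drawn
        for a in draw:
            for b in draw:
                if a != b:                 # every pair drawn together
                    cross[a - 1, b - 1] += 1.0
    return np.stack(count_feats), np.stack(cross_feats)
```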

The output of this convolutional and residual block is concatenated with all the other features (quarterly GDP, unemployment rate, American president, day, month, year, and the number of times each number was seen) and fed into a first dense layer. A second dense layer leads to the final prediction. A non-linear sigmoid is used to predict the presence or absence of each lotto number. For instance, in the figure below, 2 and 46 are two of the six numbers predicted given the input.
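The article shows this architecture only as a figure, so the sketch below is a reconstruction in Keras; the filter counts and dense-layer width are my assumptions, not values from the text:

```python
from tensorflow import keras
from tensorflow.keras import layers

N_NUMBERS = 75
N_OTHER = 6 + N_NUMBERS  # GDP, unemployment, president, day, month, year,
                         # plus the per-number draw counts

# Branch 1: convolutions over the 75x75 cross-presence matrix, with a
# residual (skip) connection adding the input back to the conv output.
cross_in = keras.Input(shape=(N_NUMBERS, N_NUMBERS, 1))
x = layers.Conv2D(8, 3, padding="same", activation="relu")(cross_in)
x = layers.Conv2D(1, 3, padding="same")(x)
x = layers.Add()([cross_in, x])  # residual learning
x = layers.Flatten()(x)

# Branch 2: all the other features, passed in as a flat vector.
other_in = keras.Input(shape=(N_OTHER,))

# Concatenate both branches, then two dense layers; the sigmoid output
# gives, for each of the 75 numbers, its predicted presence in the draw.
h = layers.Concatenate()([x, other_in])
h = layers.Dense(128, activation="relu")(h)              # width assumed
out = layers.Dense(N_NUMBERS, activation="sigmoid")(h)

model = keras.Model(inputs=[cross_in, other_in], outputs=out)
```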

The output loss chosen was the categorical cross-entropy between predictions and targets. For one data point $k$, the categorical cross-entropy is:

$$\ell_k = -\sum_{i=1}^{N} p_{k,i} \log q_{k,i}$$

with $N$ being the number of categories (the number of possible numbers, 75), $p_k$ the target distribution, and $q_k$ the prediction from the neural network. To obtain the overall categorical cross-entropy, I average over all data points. The optimizer used was Adam. I split the set of observations into a training set of 892 draws and a validation set of 315 draws.
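Putting these pieces together, training could be run along the following lines, reusing the model and feature arrays from the sketches above; the epoch count and batch size are my assumptions:

```python
import numpy as np

def categorical_cross_entropy(p, q, eps=1e-12):
    """The loss above for one data point: -sum_i p_i * log(q_i)."""
    return -np.sum(p * np.log(q + eps))

# Reuse `model`, `cross_feats`, `count_feats` and `targets` from the sketches
# above; `other_feats` (the six scalar features concatenated with the counts)
# is assumed to be built analogously and is not shown here.
cross_input = cross_feats[..., np.newaxis]  # add the channel axis for Conv2D

# Adam optimizer and the 892/315 train/validation split from the text.
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.fit(
    [cross_input[:892], other_feats[:892]], targets[:892],
    validation_data=([cross_input[892:], other_feats[892:]], targets[892:]),
    epochs=50, batch_size=32,  # assumed hyperparameters
)
```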

Representation of a Deep Neural Network Model

Results

The results are plotted in the graphs below. The graph on the left shows the error on the training set. To check for overfitting, I also calculated the error on the validation set. On both sets, the error goes down substantially, dividing the initial error by 5. This is proof that the network is capturing an unidentified structure underlying the data. I would like to emphasize this point: even though the neural network in my brain cannot identify the underlying structure of the data, the liberties given to the deep neural network give it the possibility to learn a larger class of functions, which explains how this model could capture an understanding of lotto where the human brain can only interpret it, at best, as randomness. Moreover, the algorithm converges quickly, after only a few iterations, showing the efficiency of the neural network.

Result Analysis and Discussion

Graphs showing training and validation error

Following the logic of the results, this leads to a new understanding of the concept of randomness. Where the human brain essentially perceives randomness, a powerful model from the neural network framework captures a non-random structure. The human brain, as a physical system, has limits, and so does the deep learning framework. What is represented here is that the limits of the human brain are contained strictly within the limits of deep learning, which opens whole new possibilities for our understanding of the world and of all the remaining unanswered questions.

Conclusion

For a large-scale proof of concept, I predicted the numbers that will be drawn on the 11th of April: they will be 1, 9, 13, 14, 63, and the mega number will be 7. And I can conclude on the existence of God.
