Disclaimer and Legal Information

This webpage contains notebooks related to machine learning, big data, neural networks, and related topics. It has two main purposes:

- To quickly locate code I previously wrote, in order to solve issues I may encounter while programming
- To share my work with others when needed

In this section, I store notebooks on economic analysis and adjacent topics such as commodities.

In this notebook, I implement the Hodrick-Prescott (HP) filter on US real GDP data from FRED in three ways:

- Using the Statsmodels library's implementation
- Solving the linear system of equations derived from the partial derivatives
- Obtaining a numerical approximation using SciPy's minimize function
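The second approach can be sketched in a few lines of NumPy: minimizing the sum of squared deviations plus λ times the sum of squared second differences of the trend leads, via the first-order conditions, to the linear system (I + λDᵀD)τ = y, where D is the second-difference matrix. The series below is a hypothetical stand-in for the GDP data, and `hp_filter` is an illustrative name, not the notebook's actual code.

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Return (trend, cycle) by solving the HP filter's first-order conditions.

    Minimizing sum((y - tau)**2) + lam * sum(second differences of tau ** 2)
    yields the linear system (I + lam * D.T @ D) @ tau = y, where D is the
    (T-2) x T second-difference matrix.
    """
    y = np.asarray(y, dtype=float)
    T = len(y)
    # Build D: each row applies the stencil [1, -2, 1] at a successive position.
    D = np.zeros((T - 2, T))
    for i in range(T - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    trend = np.linalg.solve(np.eye(T) + lam * D.T @ D, y)
    return trend, y - trend

# Hypothetical series: a quadratic trend plus noise stands in for real GDP.
rng = np.random.default_rng(0)
t = np.arange(100)
y = 0.01 * t**2 + t + rng.normal(scale=2.0, size=t.size)
trend, cycle = hp_filter(y, lam=1600.0)
```

λ = 1600 is the conventional value for quarterly data; the trend and cycle sum back to the original series by construction.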

This notebook demonstrates how to use Jupyter to produce quick reports. Historical prices for WTI and Brent are downloaded directly from the EIA database and plotted.

In this section, I include notebooks created while reading François Chollet's "Deep Learning with Python", a book dedicated to building neural networks.

This notebook uses the techniques described in Chapter 5 of the book to visualize the output of the intermediate layers of a convnet. I used a picture of my parents' cat to showcase the results. The most interesting part of this notebook is the final output, consisting of one picture per layer, where lighter areas correspond to the most activated cells in the tensor produced by that layer.
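Extracting the activations themselves requires a Keras model that returns intermediate outputs, but the rendering step can be sketched with plain NumPy: collapse a (height, width, channels) activation tensor into one grayscale image where lighter pixels mark stronger activations. The function name and the random tensor below are illustrative, not the notebook's code.

```python
import numpy as np

def activation_to_image(activations):
    """Collapse a (height, width, channels) activation tensor into a single
    grayscale image in [0, 255], lighter pixels marking stronger activations."""
    act = np.asarray(activations, dtype=float)
    heat = act.mean(axis=-1)        # average over the channel dimension
    heat -= heat.min()              # shift so the minimum is 0
    span = heat.max()
    if span > 0:
        heat /= span                # scale to [0, 1]
    return (heat * 255).astype(np.uint8)

# Hypothetical activations: a 4x4 feature map with 8 channels.
fake_activations = np.random.default_rng(1).random((4, 4, 8))
img = activation_to_image(fake_activations)
```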

This notebook uses an alternative way to understand what the convnet considers when analyzing a picture. The technique highlights the patterns to which each layer responds. The pre-trained network used in this example is VGG16.
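The underlying idea is gradient ascent in input space: start from noise and repeatedly nudge the input in the direction that increases a filter's activation. The toy sketch below replaces VGG16 with a single linear unit (activation = w·x), whose gradient with respect to the input is simply w; in the real notebook a framework computes this gradient through the network. All names here are illustrative.

```python
import numpy as np

def maximize_activation(w, steps=50, lr=0.1, seed=0):
    """Gradient ascent in input space for a linear unit.

    The activation is w . x, so the gradient with respect to x is w itself;
    the input is renormalized each step to keep it bounded. The result
    converges toward the pattern the unit responds to most strongly.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(size=w.shape)
    for _ in range(steps):
        x += lr * w                  # gradient step
        x /= np.linalg.norm(x)       # keep the input on the unit sphere
    return x

w = np.array([1.0, -2.0, 0.5, 3.0])   # the unit's weights
pattern = maximize_activation(w)       # ends up aligned with w
```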

This notebook presents a third way to understand a neural network graphically. In this case, I make VGG16 predict which animal appears in a picture of my parents' dog. I then use the result to create a heatmap and overlay it on the picture to see the areas where VGG16 recognized a dog.
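The combination step of this class-activation-heatmap technique can be sketched with NumPy: weight each channel of the last convolutional layer's output by a per-channel importance score (derived from gradients of the class score in the real notebook), average, apply a ReLU, and normalize to [0, 1] before overlaying on the image. The function name and random inputs below are illustrative assumptions.

```python
import numpy as np

def class_activation_heatmap(conv_output, channel_weights):
    """Combine a (height, width, channels) activation map with per-channel
    importance weights into a single heatmap scaled to [0, 1]."""
    weighted = np.asarray(conv_output) * np.asarray(channel_weights)
    heat = weighted.mean(axis=-1)       # average the weighted channels
    heat = np.maximum(heat, 0)          # ReLU: keep positive evidence only
    if heat.max() > 0:
        heat /= heat.max()              # normalize for display
    return heat

# Hypothetical 7x7 feature map with 16 channels and random importance weights.
rng = np.random.default_rng(2)
conv = rng.random((7, 7, 16))
weights = rng.normal(size=16)
heatmap = class_activation_heatmap(conv, weights)
```

The resulting array can be resized to the input image's dimensions and blended over it with matplotlib.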

This section includes links to my public kernels on Kaggle.

I created this notebook in early 2018, about a year after I started using machine learning and neural networks. With it, I wanted to apply what I was learning to a classic toy dataset, spending less time on feature engineering and more on model selection.

This notebook was one of my first steps in data visualization. The resulting graphs mapping crimes in San Francisco are fairly interesting, and I keep a link to this kernel in case I need to reuse part of the code to plot geographical data. Contrary to what the kernel's title may suggest, I never went on to fit a full model, apart from two baselines (a "null" model and a simple Naive Bayes approach).
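A common first step when plotting point data like crime locations is to bin the coordinates into a density grid. A minimal sketch with hypothetical coordinates (the bounds roughly span San Francisco; the kernel's actual data and code may differ):

```python
import numpy as np

# Hypothetical incident coordinates roughly spanning San Francisco.
rng = np.random.default_rng(3)
lon = rng.uniform(-122.52, -122.36, size=1000)
lat = rng.uniform(37.70, 37.83, size=1000)

# Bin the points into a 50x50 grid; each cell counts incidents, and the
# resulting array can be rendered as a heat map with imshow or pcolormesh.
counts, lon_edges, lat_edges = np.histogram2d(lon, lat, bins=50)
```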

I am currently (July 2019) working on implementing the IMF's DSGE model. I may create a more generic class so I can adapt the code to implement the ECB's Smets-Wouters model with minimal changes.