Disclaimer: This is just my experience of the Fulbright process. It may vary between specialties, countries, and backgrounds. Good luck!
Lately I’ve been having a debate with myself. In many areas of my life, such as friendships, work, or family, I feel I am pretty intentional about what I have: I keep just what I need. However, since I took up a few hobbies and had a bigger budget, I have been buying more material stuff. I don’t like this, and I would like to change that aspect of my life, or at least improve it. There are multiple books about minimalism, and I have read quite a few of them:
Sometimes our data is not as clean as it should be. Maybe some images are corrupted, or the entries inside the database are not what we expected. In that case, iterating through them with the PyTorch DataLoader can be a hassle. Recently I learned that you can clean your data after it has been loaded, but before you iterate over it in your training loop. I am going to give an example with images, but it can be applied to any kind of data.
I was having problems with missing information in my dataset. There were missing images which were read as NaN-filled matrices of the expected image shape, and in the end they were fed to the CNN.
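One way to do this filtering after loading but before the training loop is a custom `collate_fn` that drops NaN samples from each batch. This is a sketch, not the exact code from my project (the function name and the `(image, label)` sample layout are assumptions):

```python
import torch
from torch.utils.data import DataLoader
from torch.utils.data.dataloader import default_collate

def skip_broken_collate(batch):
    """Drop samples whose image tensor contains NaNs, then collate the rest."""
    clean = [(img, label) for img, label in batch
             if not torch.isnan(img).any()]
    if not clean:
        return None  # the training loop should skip fully-broken batches
    return default_collate(clean)

# Usage (dataset is any map-style Dataset yielding (image, label) pairs):
# loader = DataLoader(dataset, batch_size=16, collate_fn=skip_broken_collate)
# for batch in loader:
#     if batch is None:
#         continue
#     images, labels = batch
```

Because the cleaning happens in `collate_fn`, corrupted samples never reach the model, and the dataset class itself stays untouched.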
I am working with a dataset that contains lots of missing values. However, a missing value is not represented as NaN but as “–”. To find how many missing values there are in a specific column, I cannot use the typical Pandas functions. Instead, I worked around it by counting the number of occurrences of this character in the column and subtracting it from the total number of entries:
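A minimal sketch of that count-and-subtract approach (the column name `age` and the toy data are invented for illustration):

```python
import pandas as pd

# Toy column where missing values appear as the "–" placeholder
df = pd.DataFrame({"age": ["23", "–", "41", "–", "35"]})

total = len(df["age"])
missing = (df["age"] == "–").sum()  # appearances of the placeholder
present = total - missing           # entries that actually hold a value
print(missing, present)             # → 2 3
```

An alternative is to convert the placeholder first, e.g. `df["age"].replace("–", pd.NA)`, after which the usual `isna().sum()` works as expected.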
In order not to have to type the full version name every time you want to run Python 3, you can link Python 3 to the default `python` command. You need to use the following commands:
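A common way to do this on Debian/Ubuntu-style systems (an assumption; adjust the paths for your setup) is either registering an alternative or adding a shell alias:

```shell
# Register python3 as an alternative target for the `python` command
sudo update-alternatives --install /usr/bin/python python /usr/bin/python3 1

# Or, more simply, add an alias to your shell configuration
echo "alias python=python3" >> ~/.bashrc
source ~/.bashrc
```

The alias only affects interactive shells, while `update-alternatives` changes the command system-wide, so pick whichever matches how you run Python.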
I was having many issues with GPU memory management in a transfer learning problem when using TensorFlow 2.0 with the Keras module. Even though I was using the Inception V3 model with all the weights of the architecture frozen, the code ran out of memory. The weights of the model occupy only 98 MB, so that wasn’t the problem. The input shape was (299, 299, 3), so the images being used were not huge, but I couldn’t use a batch size higher than 16.
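One common mitigation (a sketch of a general TensorFlow setting, not necessarily the exact fix for this case): by default TensorFlow reserves nearly all GPU memory at startup, and enabling memory growth makes it allocate on demand instead. This must run before any GPU operation creates a context:

```python
import tensorflow as tf

# Allocate GPU memory on demand instead of grabbing it all up front.
# On a machine with no GPUs the loop simply does nothing.
for gpu in tf.config.experimental.list_physical_devices("GPU"):
    tf.config.experimental.set_memory_growth(gpu, True)
```

It is also worth remembering that freezing weights only removes their gradient updates; the forward activations for every image in the batch are still computed, and those are usually what eats the memory, not the 98 MB of weights.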
Recently I have been able to use 2 GPUs for model training. I was having problems while training with both of them: only one of them was actually being used.
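In TensorFlow/Keras, a typical way to make all visible GPUs participate is `tf.distribute.MirroredStrategy`, which replicates the model and splits each batch across devices. A minimal sketch (the tiny model here is invented for illustration; on a CPU-only machine the strategy falls back to a single replica):

```python
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

# Building and compiling inside strategy.scope() is what makes
# training use every replica.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(10, activation="softmax", input_shape=(784,)),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

After this, a plain `model.fit(...)` call distributes each batch across all the GPUs the strategy detected.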
Recently I needed to combine a bunch of images for a book. Since I didn’t want to use a dedicated program, I thought it would be easier to combine them using Python, with the following script (thanks to Python Programming):
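A sketch of such a script using Pillow (the original script is not reproduced here; the function name and side-by-side layout are assumptions):

```python
from PIL import Image

def combine_horizontally(images):
    """Paste a list of PIL images side by side onto one white canvas."""
    width = sum(img.width for img in images)
    height = max(img.height for img in images)
    combined = Image.new("RGB", (width, height), "white")
    x = 0
    for img in images:
        combined.paste(img, (x, 0))
        x += img.width
    return combined

# Example with two solid-colour placeholder images:
imgs = [Image.new("RGB", (100, 80), "red"), Image.new("RGB", (50, 120), "blue")]
combine_horizontally(imgs).save("combined.png")
```

For real files you would build the list with `Image.open(path)` for each image, and a vertical variant just swaps the roles of width and height.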