I wanted to learn more about running Open WebUI locally through Docker while also testing out the new Llama 3.2 Vision models (11B and 90B). My system, with a 6GB VRAM video card, wasn't going to be able to run the inference model locally, so I needed a free or inexpensive option; enter together.ai. A free account offers the 11B model at no charge and provides easy access to the 90B model as well. So I downloaded Docker Desktop, as the getting-started docs suggested, and then my inexperience hit a roadblock: how do I run a Docker terminal command, and which command do I run?
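For anyone hitting the same wall, here is a minimal sketch of one way to start the container, using Docker's Python SDK (pip install docker) rather than the terminal. The image name, internal port, and data path are taken from the Open WebUI docs; the environment variable names and the together.ai base URL are assumptions to double-check against current documentation, and this isn't necessarily the exact setup the rest of this post lands on.

```python
# A sketch, not gospel: start Open WebUI in a container and point it at
# together.ai's OpenAI-compatible API instead of a local model.
import docker

client = docker.from_env()

container = client.containers.run(
    "ghcr.io/open-webui/open-webui:main",   # official image per the Open WebUI docs
    detach=True,
    name="open-webui",
    ports={"8080/tcp": 3000},               # UI ends up at http://localhost:3000
    volumes={"open-webui": {"bind": "/app/backend/data", "mode": "rw"}},
    restart_policy={"Name": "always"},
    environment={
        # Assumed variable names for the OpenAI-compatible connection;
        # verify against the Open WebUI and together.ai docs.
        "OPENAI_API_BASE_URL": "https://api.together.xyz/v1",
        "OPENAI_API_KEY": "YOUR_TOGETHER_API_KEY",
    },
)
print(f"Started container {container.short_id}")
```

The same thing can be done with a single docker run command in the Docker Desktop terminal, which is essentially the command I was trying to track down.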
Testing Dropout Rates for Machine Learning with FastAI
As I continue my adventures in machine learning through the FastAI courses, I wanted to explore the concept of dropout rate. If you would like to see the Jupyter Notebook used for these tests, including full annotations about the what and why, check out my machine learning GitHub project, specifically Testing Dropout Rates (small images).ipynb.
Really quickly: dropout is a technique in Convolutional Neural Networks (CNNs) of randomly deactivating a fraction of neurons during training (e.g. applied to the first layer of an image, this would be individual pixels), and the dropout rate is the fraction that gets deactivated. The goal is to prevent overfitting (i.e. doing notably better on the training set than on the validation set) and thus improve how well the model generalizes. In other words, block a percentage of the material on each pass to keep the model from becoming too dependent on repeating patterns that lead it astray.
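To make that concrete, here is a tiny illustration of the mechanism in plain PyTorch (not the fastai learner from the notebook, just a standalone sketch): during training, nn.Dropout zeroes a random subset of activations and rescales the rest, and in eval mode it passes everything through untouched.

```python
# Standalone sketch of what a dropout layer actually does.
import torch
import torch.nn as nn

torch.manual_seed(0)
drop = nn.Dropout(p=0.5)   # dropout rate: the fraction of units zeroed out
x = torch.ones(1, 8)

drop.train()               # training mode: random units are dropped
print(drop(x))             # roughly half the values are 0, the rest scaled by 1/(1-p)

drop.eval()                # inference mode: dropout is disabled
print(drop(x))             # identical to the input
```

In fastai, the comparable knob is (if I'm remembering the API right) the `ps` argument that sets the dropout probabilities in the model head.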
These tests were set up to isolate the dropout rate as much as possible. Also, while these tests used ResNet50, results may differ with a different model. Okay, enough jibber-jabber, let's jump right to the conclusions, shall we?