Category Archives: Coding

Connecting Local Open WebUI to Together.ai Models

I wanted to learn more about running Open WebUI locally through Docker while also testing out the new Llama 3.2 Vision models (11B and 90B), but my system, with its 6GB VRAM video card, wasn’t going to be able to run inference locally, so I needed a free or inexpensive option; enter together.ai. Their free account offers the 11B model at no charge and also gives easy access to the 90B model. So I downloaded Docker Desktop, as the getting-started docs suggested, and then my inexperience hit a roadblock: how do I run a Docker terminal command, and which command do I run?
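
For anyone hitting the same wall, the command involved ends up looking roughly like the sketch below, which assumes Together.ai’s OpenAI-compatible endpoint and Open WebUI’s standard Docker image; the API key is a placeholder, and the port and volume name can be adjusted to taste:

    # Run Open WebUI on http://localhost:3000, pointed at Together.ai's
    # OpenAI-compatible API instead of a local model server
    docker run -d -p 3000:8080 \
      -e OPENAI_API_BASE_URL=https://api.together.xyz/v1 \
      -e OPENAI_API_KEY=your_together_api_key \
      -v open-webui:/app/backend/data \
      --name open-webui \
      ghcr.io/open-webui/open-webui:main

Once the container is up, the Together-hosted models (including the Llama 3.2 Vision ones) should show up in Open WebUI’s model dropdown.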

Continue reading

Piwigo: Configure Custom SMTP on Dreamhost

Piwigo is open-source software for sharing image galleries easily. Installing it on Dreamhost (I’m using a VPS, but any type of hosting works) was super easy, but by default any email it sent bounced back with a Mail System Delivery Failure because of email security problems.

Luckily, the solution was to create an email account in Dreamhost and configure SMTP in Piwigo. Here are the steps in a new installation:
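
To give a sense of where those steps end up, here is a minimal sketch, assuming Piwigo’s standard $conf['smtp_*'] settings and Dreamhost’s usual SMTP host and SSL port; the path, mailbox address, and password are placeholders for the account you create in the Dreamhost panel:

    # Over SSH from the Piwigo directory, append SMTP settings to the local config
    # (assumes local/config/config.inc.php already exists with an opening <?php
    #  line and no closing ?> tag)
    cd ~/example.com/piwigo
    {
      echo "\$conf['smtp_host'] = 'smtp.dreamhost.com:465';"
      echo "\$conf['smtp_user'] = 'gallery@example.com';"
      echo "\$conf['smtp_password'] = 'the-mailbox-password';"
      echo "\$conf['smtp_secure'] = 'ssl';"
    } >> local/config/config.inc.php

The same lines can just as easily be added by editing the file directly in a text editor.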

Continue reading

Installing Castopod on Shared Hosting

Castopod is podcast hosting software that lets you serve your podcast(s) from your own server rather than through a secondary service like anchor.fm, blubrry, or libsyn. While I personally use anchor.fm for some podcasts that I wanted to monetize easily, for a new podcast, Story Suggest, I had no desire for it to be monetized by me or anyone else. I also wanted to dip my toe into integration with the fediverse. Don’t worry if you don’t know what that is; it’s not critical to any of this.

Either way, while the installation documentation for Castopod is relatively clear, I felt there were a few quick notes/tips I could add for those installing on a shared host like Dreamhost. (I maintain a VPS at Dreamhost, but the functionality and installation are the same.) Below I’ve highlighted some clarifying points to help others quickly get set up with Castopod.

Continue reading

Fixing Yarn ESOCKETTIMEDOUT Error During Discourse Setup

While installing Discourse on a free-tier Google Cloud Compute Engine instance, I was following this Discourse install tutorial with only minor adjustments:

  • Ubuntu 20.04 LTS Minimal
  • Standard persistent disk (the default is Balanced)
  • e2-micro instance

However, while waiting for Step 9 (run after ./discourse-setup) to finish building everything, it would fail with an ESOCKETTIMEDOUT error related to yarn. The last command it tries to run is [ ! -d 'node_modules' ] || su discourse -c 'yarn install --production && yarn cache clean'.

Here’s what is going wrong: yarn has a default network timeout that is fine if you are using the minimum hardware recommended for Discourse, but on a micro instance the install takes too long and yarn gives up. To fix this, you have to manually edit one of the install scripts that run while building the new Docker container for Discourse.
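
As a sketch of the general idea (the exact file and timeout value in the full post may differ; this is where the yarn line lived in the discourse_docker templates at the time):

    # Bump yarn's network timeout on the yarn install line, then rebuild the container
    cd /var/discourse
    sed -i "s/yarn install --production/yarn install --production --network-timeout 600000/" \
      templates/web.template.yml
    ./launcher rebuild app
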
Continue reading

Installing Discourse with Amazon EC2 t2.micro Instance and SparkPost

This is more notes and reference than an in-depth tutorial, but after spending a few hours trying different things, here’s how to get it all set up. Remember that a t2.micro instance has only 1GB of memory, so, as Discourse recommends, if you intend to grow to an Internet-wide audience you should use a t2.small instance instead.
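
As a rough outline of the shape of the setup, here is a sketch that follows the standard Discourse Docker install; the SparkPost values are the usual ones for their SMTP relay, and the API key is a placeholder you create in the SparkPost dashboard:

    # On the EC2 instance, become root and run the standard Discourse install
    sudo -s
    git clone https://github.com/discourse/discourse_docker.git /var/discourse
    cd /var/discourse
    ./discourse-setup
    # When discourse-setup prompts for SMTP settings, SparkPost's relay is roughly:
    #   SMTP server:   smtp.sparkpostmail.com
    #   SMTP port:     587
    #   SMTP username: SMTP_Injection
    #   SMTP password: a SparkPost API key with "Send via SMTP" permission

On a 1GB instance, discourse-setup should also offer to create a swapfile; accepting that helps the build get through the memory-hungry steps.
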
Continue reading

Dropout Rate compared to Accuracy

Testing Dropout Rates for Machine Learning with FastAI

As I continue my adventures in machine learning through the FastAI courses, I wanted to explore the concept of dropout rate. If you would like to see the Jupyter Notebook used for these tests, including full annotations about the what and why, check out my machine learning GitHub project, specifically Testing Dropout Rates (small images).ipynb.

Really quickly: dropout is a technique used in Convolutional Neural Networks (CNNs) that randomly removes neurons during training (e.g., in the first layer of an image model this would be individual pixels) to prevent overfitting (i.e., doing notably better on the training set than on the validation set) and thus increase the general applicability of the model; the dropout rate is the fraction of neurons that get dropped. In other words, blocking a percentage of the material forces the network not to become too dependent on repeating patterns that would lead it astray.

These tests were set up to isolate dropout rate as much as possible. Also, since these tests used ResNet50, results may differ with a different model. Okay, enough jibber-jabber, let’s jump right to the conclusions, shall we?
Continue reading

Machine Learning Install on Windows with Fast.ai

When getting started exploring machine learning, you will likely come across the free lessons at Fast.ai. These lessons require a few gigabytes’ worth of programs and algorithms as well as access to a powerful Nvidia GPU (e.g., a GTX 1060). The first lesson even walks you through setting up a cloud server for just that purpose, but what if your PC already has a powerful Nvidia graphics card? What if you use Windows?

No problem. This quick guide walks you through the process of setting up a local environment for machine learning, starting with the Fast.ai tutorial series. It’s designed for Windows PCs with an Nvidia graphics card. Alright, let’s get started with a few quick downloads.
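
To give a sense of the overall shape before the step-by-step details, here is a minimal sketch from an Anaconda Prompt, assuming Anaconda, Git, and a current Nvidia driver are already installed; exact packages and versions change over time, so treat this as the general pattern rather than exact commands:

    # Clone the fastai course repository and build its conda environment
    git clone https://github.com/fastai/fastai.git
    cd fastai
    conda env create -f environment.yml
    conda activate fastai
    # Launch Jupyter and open the course notebooks
    jupyter notebook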

Continue reading