PROGRAMMING LANGUAGE FEATURES I'D LIKE MOST LANGUAGES TO HAVE
Here are some things I am disappointed to leave behind when I switch back and forth between languages that don't have certain features. If Expressions Found in: Rust; probably other functional languages Irritatingly missing from: Everything let x = 5 + if true { 1 } else { 7 }; println!("{x}"); // 6 What I have to do instead: let mut x = 5; if true { x += 1; } else { x += 7; } System.
MUSIC CHANNEL PART 3 - BRANDING
This is the third in a series of posts about analyzing YouTube music channels. Part 1 - Music Channel Introduction and Data Collection Part 2 - Analysis of Channel Activity Part 3 - Branding Branding Remember that we're basing our channel on the behavioral patterns of some other channels: TheSoundYouNeed MajesticCasual MrSuicideSheep Chill Nation Eton Messy MonsterCat MikuMusicNetwork Among their other patterns, we're going to use them as a base for how to brand our channel as well.
MUSIC CHANNEL PART 2 - STATS
This is the second in a series of posts about analyzing YouTube music channels. Part 1 - Music Channel Introduction and Data Collection Part 2 - Analysis of Channel Activity Part 3 - Branding Your Questions, Answered We left off last time with a freshly created database full of information we could query. We sought to answer a few questions: How many videos should we seed our channel with? How often should we upload?
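Not from the post itself, but as an illustration of the kind of query that answers the upload-cadence question, here is a minimal Python/sqlite3 sketch; the videos table and the channel_id / published_at columns are assumed names, not necessarily the schema built in Part 1:

    import sqlite3

    # Assumed schema: videos(channel_id TEXT, published_at TEXT as ISO-8601).
    conn = sqlite3.connect("channels.db")

    # Average gap in days between uploads, per channel.
    rows = conn.execute("""
        SELECT channel_id,
               (julianday(MAX(published_at)) - julianday(MIN(published_at)))
                   / (COUNT(*) - 1) AS avg_days_between_uploads
        FROM videos
        GROUP BY channel_id
        HAVING COUNT(*) > 1
        ORDER BY avg_days_between_uploads
    """).fetchall()

    for channel_id, avg_gap in rows:
        print(f"{channel_id}: one upload every {avg_gap:.1f} days")

    conn.close()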
MUSIC CHANNEL PART 1 - DATA
This is the first in a series of posts about analyzing YouTube music channels. Part 1 - Music Channel Introduction and Data Collection Part 2 - Analysis of Channel Activity Part 3 - Branding Background I often listen to new music on YouTube because its discovery systems occasionally unearth some great finds. After a while of listening there, however, you begin to realize that there are only a handful of channels that get recommended to you, and they all seem to just upload songs and let the views roll in, not unlike a radio DJ.
IT'S ALWAYS DNS
I had decided that I was going to revisit the way Lyricall’s infrastructure is deployed because, while functional, it was many complicated and esoteric steps away from “one-click deploy”. Near the completion of the .NET Core side of the deploy, when it came time to test production-like connections to the primary, real database, problems started to arise. No matter the configuration, .NET was spewing connection refused errors to the postgres backend.
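Not part of the original write-up, but when connection refused errors appear, one language-agnostic sanity check is whether the database host even resolves and whether anything is listening on the postgres port. A minimal Python sketch, with the hostname as a placeholder and 5432 as the default postgres port:

    import socket

    host = "db.example.internal"  # placeholder for the real database host
    port = 5432                   # default postgres port

    # Step 1: does the name resolve at all? ("It's always DNS")
    addr = socket.gethostbyname(host)
    print(f"{host} resolves to {addr}")

    # Step 2: does anything accept a TCP connection on the postgres port?
    with socket.create_connection((addr, port), timeout=5):
        print(f"TCP connection to {addr}:{port} succeeded")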
SWIPR - SERVER
Recall our arch diagram from two chapters ago: The Database Service is a collection of methods for opening and closing SQLite connections and, while important, is really exceedingly boring and, furthermore, unsurprising. As such, we’ll leave it out of the discussion. We’ll also leave out building the Razor Pages and the views, dealing with identity, migrating our initial user tables with EF Core, and other such minutiae of running an ASP.
SWIPR - DATASTORE
We need a datastore. Primarily so that we can stow a given user’s Tinder information – Facebook ID, Facebook password, Facebook access token, and Tinder access token – but also because we now have a complicated application whose state must be preserved somehow. We’re going to skip talking about the tables relating to how ASP.NET Core projects do identity because there are a lot of tables for that. The only thing worth pointing out with respect to that is that many of our tables will keep a foreign key to the autogenerated AspNetUsers.
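Purely as an illustration of the shape such a table could take (the table and column names below are assumptions, not the real Swipr schema), a credentials table keyed back to the identity table might look like this, sketched with Python's sqlite3:

    import sqlite3

    conn = sqlite3.connect("swipr.db")

    # Hypothetical table holding the per-user Facebook/Tinder secrets described above.
    # AspNetUsers is the table ASP.NET Core identity generates and is assumed to exist;
    # everything else here is illustrative.
    conn.execute("""
        CREATE TABLE IF NOT EXISTS UserCredentials (
            Id                INTEGER PRIMARY KEY AUTOINCREMENT,
            UserId            TEXT NOT NULL,
            FacebookId        TEXT,
            FacebookPassword  TEXT,
            FacebookToken     TEXT,
            TinderToken       TEXT,
            FOREIGN KEY (UserId) REFERENCES AspNetUsers (Id)
        )
    """)
    conn.commit()
    conn.close()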
SWIPR - LIBSWIPR
As a harness for our model, we’ll create a server that serves as a centralized access point. This server provides the necessary facilities to register an account with it, interact with Tinder, and receive responses from the model. The general architecture for the server will go something like this: SwiprServer is the server, in our case, an ASP.NET Core 2 project, and LibSwipr is our custom tooling around all the other interactions.
SWIPR - SWIPR SCRIPT SERVICE
Now that we have a trained model, we need to consider how to interact with it. The full text of the script can be found here on GitHub, but we’ll walk through it here too. The overview of this step is that we have a Python script acting as a local server, where the client side of the script receives input and returns output to external consumers, and the server side loads the PyTorch model and runs the computation.
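As a rough sketch of the server half of that idea (not the actual script: the model path, port, protocol, and preprocessing below are all assumptions), the loop might look something like this:

    import socket
    import torch
    from PIL import Image
    from torchvision import transforms

    MODEL_PATH = "swipr_model.pt"   # assumed filename; assumes torch.save(model, ...) saved the whole model
    HOST, PORT = "127.0.0.1", 9090  # assumed local address

    # Load the trained model once, up front, and put it in inference mode.
    model = torch.load(MODEL_PATH, map_location="cpu")
    model.eval()

    # Assumed preprocessing; the real pipeline depends on how the model was trained.
    prep = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen()
        while True:
            conn, _ = srv.accept()
            with conn:
                # Protocol sketch: the client sends an image path, we send back a class index.
                image_path = conn.recv(4096).decode().strip()
                x = prep(Image.open(image_path)).unsqueeze(0)  # add a batch dimension
                with torch.no_grad():
                    pred = model(x).argmax(dim=1).item()
                conn.sendall(str(pred).encode())

The client half would then just open a connection to that port, send an image path, and read back the predicted class.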
SWIPR - FAST.AI AND CNNS
Finally, we can consider creating the image classification model. The Jupyter notebook that was used for this process can be found here on GitHub. DataLoader The first thing the creation of a PyTorch model requires is a DataLoader, which is a neat structure for handling the data we want to go over and what their labels should be. It loads x’s (inputs) and maps them to y’s (outputs).
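For readers who haven't met a DataLoader before, here is a minimal plain-PyTorch sketch of that x-to-y mapping; fast.ai wraps this machinery for you, and the tensors below are random placeholders rather than the Swipr data:

    import torch
    from torch.utils.data import DataLoader, TensorDataset

    # Placeholder data: 8 fake "images" (x's) and 8 binary labels (y's).
    xs = torch.randn(8, 3, 64, 64)
    ys = torch.randint(0, 2, (8,))

    dataset = TensorDataset(xs, ys)           # pairs each x with its y
    loader = DataLoader(dataset, batch_size=4, shuffle=True)

    for batch_x, batch_y in loader:
        print(batch_x.shape, batch_y.shape)   # torch.Size([4, 3, 64, 64]) torch.Size([4])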