Past Entries...

posted this in: General, Servers, Software, Technology
452 Words

The branding & theme was named by my partner, Annie 🙂

I actually started this post as an addendum to my previous one, but realised there's enough here that I want to talk about, on both a personal and technical level, to warrant its own entry.

Why “Minty Charmander”?

Well, Annie thought the colour scheme reminded her of a Charmander, and combined with the light green highlights – “Mint” 🙂

The colour scheme uses a number of my favourite colours in a limited palette – purposefully so. I recall from an old design course, literally a couple of decades ago now, that in UX a small number of colours that can be interchanged without conflicting with each other gets information across better than a large, dynamic swatch of colour.

I am using the ol’ trusty Bootstrap framework for the UI and layout of everything. I don’t have any real special rationale for using Bootstrap – it’s just what I’m most familiar with; I think as I ease myself back into coding after a long break, it’s nice to crawl before I walk, and walk before I run 😅

PlanetScale?!

Laziness, plus the idea that I needed a stable service to run a DB for my little projects, convinced me to continue with PlanetScale – yes, it costs $47 USD a month, but it’s more stable and more nicely managed than anything I could cobble together with a random self-hosted solution.

I decided to keep paying for it, pending further efforts to make things self-hosted down the track; for the time being, it’s nice to have a DB that is:

  • highly available;
  • able to spawn itself into separate main and dev branches

I could probably implement this without a paid service – but the DB acts as a backend for multiple systems (as it would if I were to self-host), and that’s a single point of failure I couldn’t keep up the way a professionally managed service, designed to stay online, can.
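For reference, connecting to it from PHP is no different to any other MySQL connection over TLS – a minimal sketch is below, where the host, credentials, database name and CA bundle path are all placeholders rather than my real config:

```php
<?php
// Minimal sketch of a PDO connection to a PlanetScale branch.
// Host, database, username and password are placeholders you'd copy from the
// PlanetScale dashboard; the CA path assumes a Debian/Ubuntu-style system.
$dsn = 'mysql:host=aws.connect.psdb.cloud;dbname=my_database;charset=utf8mb4';

$options = [
    PDO::ATTR_ERRMODE      => PDO::ERRMODE_EXCEPTION,
    // PlanetScale only accepts TLS connections, so point PDO at the system CA bundle.
    PDO::MYSQL_ATTR_SSL_CA => '/etc/ssl/certs/ca-certificates.crt',
];

$db = new PDO($dsn, 'my_username', 'my_password', $options);

// Each branch (main, dev) has its own credentials, so pointing an app at a
// different branch is just a matter of swapping these values out.
```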

At the end of the day, it’s actually pretty easy to justify the cost of this database; I’ve spent more on dumber stuff in the past. At least this is a sensible subscription 😜

So, what’s next?

Well, the branding is mostly done, but there are a few missing things – search result statistics and category browsing callouts, to name a couple. I’ll be taking my time fixing everything, and eventually I hope to use this blog a lot more as a diary of the things I get up to on a personal level, as opposed to just dumping whatever pseudo-technical stuff crosses my mind!

posted this in: General, Software, Technology
182 Words

We have a new look for the blog! I hope you all like it 🙂

For those that are curious, below is how things used to look (not too long ago!)…

How my blog originally looked

I’d been using a theme called Independent Publisher for about 7 years or so. And while I am indeed retired, I didn’t want to let my technical skills (as little as they are) go to waste. So I updated the look of both jtiong.dev and jtiong.blog to share similar branding now 🙂

It’s been more than a moment since I last touched a WordPress theme, so I’ve no doubt there are bugs and issues with the new theme you see now (built lovingly from scratch over a day or so, as you’ll be able to see from the commits on jtiong.dev!). But this change will give me plenty to fix and maintain on this blog going forward.

And now finally after most of a day’s work, my domains are starting to be a little bit more on-brand! ♥

posted this in: Software, Technology
310 Words

I previously posted about jtiong.dev – keeping my commit logs going and making things presentable for future collaborators and colleagues, as well as serving as my own little bit of self-promotion on the internet.

With my recent break from work, and taking the time to sharpen and upskill myself, I thought I’d bring this into the future by having the site pull from my GitHub account instead of a now-defunct GitLab installation.

In the current “1.0” version:

  • Commits are automatically fed through via cURL requests
  • There’s no censorship for any potentially sensitive information
  • There’s no authentication for management of commits/repos
  • There’s no filtering you can do
  • The date/commit times are inconsistent

Not to mention it’s GitLab-powered, and as I’m currently working through some property changes – bouncing between two properties – the server hardware hosting GitLab is currently turned off.

Cue my move to PlanetScale as the core DB service for my personal online stuff, and moving my code to GitHub where possible (because it’s where everyone else is).

For version 1.1, which upgrades this, I’d like to add:

  • An authenticated system that lets me hide a repo, or a given commit hash, so any potentially sensitive info isn’t revealed
  • A per-repo filter, so you can view the commits for a given repository
  • More accurate representation of the commits/date/timeline

Implementation Logic

Just a couple of loops to go through Repos → Commits is all I really need for my personal scale. Below is a very simplified diagram:

This’ll all be wrapped up in some cron tasks that run maybe hourly or so, to avoid spamming GitHub.
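For the curious, the shape of those two loops is roughly the sketch below – hedged, and not the actual site code. It hits GitHub’s public REST API with cURL; the GITHUB_USER and GITHUB_TOKEN environment variables and the printf at the end are stand-ins for the real configuration and storage steps.

```php
<?php
// Rough sketch of the Repos -> Commits loops, intended to be run from an
// hourly cron job. GITHUB_USER and GITHUB_TOKEN are placeholder env vars.

function githubGet(string $url): array
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_USERAGENT      => 'jtiong-dev-commit-feed', // GitHub's API requires a User-Agent
        CURLOPT_HTTPHEADER     => [
            'Accept: application/vnd.github+json',
            'Authorization: Bearer ' . getenv('GITHUB_TOKEN'),
        ],
    ]);
    $body = curl_exec($ch);
    curl_close($ch);

    return json_decode($body, true) ?? [];
}

// Loop 1: every repository on the account.
$repos = githubGet('https://api.github.com/users/' . getenv('GITHUB_USER') . '/repos?per_page=100');

foreach ($repos as $repo) {
    // Loop 2: recent commits for that repository.
    $commits = githubGet('https://api.github.com/repos/' . $repo['full_name'] . '/commits?per_page=50');

    foreach ($commits as $commit) {
        // The real version would upsert into a commits table keyed on the SHA,
        // so hourly re-runs don't create duplicate rows.
        printf("%s %s %s\n", $repo['name'], $commit['sha'], $commit['commit']['committer']['date']);
    }
}
```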

It’s just simple little pleasurable busywork to be honest, but after about 6 months of not really touching any code and taking my time with stuff, it’s nice to get back into the groove!

posted this in: Ramblings, Software
146 Words

Very recently my time at MindArc sadly came to an end, so I’ve been working to skill up on some of the things I gleaned from my time there. Even though it was a short stint, it was a great learning experience coming into an eCommerce agency.

I’m looking at using the following tech stack to resurrect my currently dead jtiong.com site!

  • Cloudflare Workers
  • Cloudflare Pages
  • PlanetScale DB
  • Remix.js
  • Tailwind CSS

I’m adding to my skillset beyond PHP (although not abandoning it!) – a few of the other tools I build still use PHP (such as this repo for one of my Rust game servers).

After my time at Padua learning Angular (which I found really confusing) and struggling for a bit, I think I’m ready to dive into using TypeScript with Remix to do a lot of stuff. This’ll be interesting…!

posted this in: Personal, Software, Technology
667 Words

January and the start of February also brought a look at better ways of brain-dumping the knowledge inside my head into something tangible. Sort of a legacy thing, I think.

Obsidian is a note-taking app, based around the Markdown plaintext format, that’s been around for years. A lot of my friends with hyper-technical backgrounds have been huge advocates of it, and in my effort to find something that’ll let me brain-dump with an intelligent, code-friendly linking method, Obsidian came up time and time again.

Obsidian’s Graph View provides a visual representation of your Notes, showing how they relate to each other! It’s really interesting to navigate around!

This blog post is generally just me rationalizing why I’m switching to it over the two existing services I use (and pay for) – Notion and ClickUp – both of which are fantastic apps for people who need something a little fancier. But up front, I think it’s best to talk about cost. Both apps are wonderful; however, they cost money – $5 USD and $12 USD per month respectively, which adds up to a little over $200 USD per year. That’s more than one might think in these trying times!

And so…! In a bid to reduce my overheads, I thought I’d look into DIY solutions that I can integrate with, or piggyback on, more critical services. In this case Obsidian works well – it can use iCloud Drive to store its vault. iCloud isn’t a service I can easily get rid of: my mobile phone and my tablet are both rooted deep in the Apple ecosystem, as there are health-related devices and apps that are better on iOS than on Android or Windows for my situation (your mileage may vary, of course). Lucky for me, though, iCloud Drive is still usable on Windows – meaning I technically don’t need to worry about something like the paid Obsidian Sync service.

Security is another thing I find myself a little concerned about. There’s not much I can do about state-level bad actors gaining access to my data (and I don’t think anyone’d find much use for it) – but your typical cyber criminal is still a concern, because they operate on an interpersonal level. The last thing I need is sensitive data (like health records) getting compromised and leveraged against me. To make things worse, it turns out that Notion isn’t end-to-end encrypted – which kind of explains why it’s so easy to publish something directly to the web.

Scary.

ClickUp is also web-based and doesn’t do much better. I feel like Notion and ClickUp don’t have the resources to build privacy at the level Apple does with its iCloud services. Having been subject to some very public breaches of customers’ data (not Apple’s fault – the accounts were socially engineered), Apple has no doubt more than doubled down to make sure it never gets the blame for any cyber security breach.

So, all in all, I’ve moved to Obsidian, and as of the time of this post it’s been almost 2 weeks. I’ve started slowly porting the knowledge dumped in Notion across into it. It’s a long, slow and tedious process, but the beauty of the way Obsidian draws links between articles (Wiki-esque) means I don’t end up with duplicate documents, unless I forcibly make them within the file structure of the Vault itself.

It’s also nice that I can write SQL-esque “Dataview” queries that generate lists of pages inside a note. It feels a lot more like a programmer’s knowledge assistant than a “note taking” application.

Using Obsidian feels natural now, and I keep improving how I use it as I go along; it’s still got that shiny, “learning new hacks all the time” feel of a new application.

It feels like The Right Move™

posted this in: Networks, Servers, Technology
530 Words

As my homelab and private Discord community have grown, I’ve needed to roll out more than just websites – web applications too, and services on different ports that need to be proxied back to a domain or subdomain address. The old setup was horrible… so I set about fixing it once and for all in November and December 2022.

The Problem…

I was stuck with something that looked like this:

The Old Proxy Setup

This meant that for each website I deployed, I’d essentially need to double-proxy myself; and it was honestly a little bit confusing to work with SSL certificates.

How did it come to this?

One of the legacies of my time at Hostopia was building a Docker-based local test environment that was portable and rapidly deployable, using an nginx-proxy container with Apache containers for the websites behind it.

The beauty of this setup was that I could quickly roll out a website as needed anywhere with the magic of Docker. And for my initial purposes, that was fine.

The problem arose when I tried to roll out secondary services – GitLab, Minecraft maps, game server UIs, etc. – which all run on various non-standard HTTP(S) ports but need to be reverse-proxied to subdomains (an example being https://map.northrealm.info, a Minecraft server map that runs on port 8123). I’d have to have ALL THOSE RESOURCES on a single server, or run another double proxy on each additional server to account for them. Not very efficient.

And secondly, the bigger problem was organising and renewing SSL certificates. It was a hassle tracking, renewing and creating certificates as needed, because traffic was double-routed: first through Nginx Proxy Manager, then through the local Docker container host the app/site lived on!

So, as for why it was configured like this? A mix of speed, and laziness about doing things “The Right Way™”. What was supposed to be quick and easy eventually just became a hassle that wasn’t working properly.

The Solution

My services infrastructure now looks like this:

The New Proxy Setup

It might not seem like much – but it’s now server-hardware agnostic, and I don’t need to install a separate cluster of containers on each server to manage its sites or apps locally.

Nginx Proxy Manager (NPM) now acts as that cluster of infrastructure containers, spanning the full home network as opposed to being tied down to one host. Custom nginx configurations are created “per host” in the app, and they handle how pages and content are served for sites, or direct traffic specifically to a given application.

Much better I say! 😀

*Facepalm*

There are plenty of ways to skin a cat, and this is definitely better than the original setup! It’s not a perfect solution either, but this blog post wasn’t written in an attempt to find absolute perfection (perfection is something to strive for, though you can’t achieve it unless you’re a divine power) – it’s more to document the journey of my ignominy and learnings as I go about running a homelab that actually gets some use 🙂

posted this in: General, Personal, Software, Technology
365 Words

I’ve been back into coding this month, on my own projects and not just for work. The passion isn’t “burning bright” anymore, but I’m working towards reigniting it by coding little bits and pieces that do what I want.

https://jtiong.dev is a bit of a commit message logging script that I wrote and integrated with my local GitLab installation; as you’ll see if you visit the site, October’s my highest number of commits on personal projects in recent memory.

The reason for the skewed figure is that up until about August 2022, I had most of my projects stored in GitHub. Some part of me still thinks I should keep things in GitHub – but I’m looking into using that as more of a backup-style system.

The Code Backup Project

I know GitLab has a “mirror repository” feature – but unfortunately it doesn’t seem to be working very well for me (too fiddly).

So to get around that I’ll be looking at building my own little automated flow:

It’s not an ideal setup, but I think it will work for my needs. Because of how finicky it is, I may well end up writing the automation script entirely in PHP (that way, I can integrate notifications to myself over Discord, among other things).
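As a very rough idea, it could look something like the sketch below – the repo paths, the “github” remote name and the Discord webhook URL are all placeholders, not my actual setup:

```php
<?php
// Hypothetical sketch of the backup flow: mirror-push each local clone to a
// GitHub remote, then report the result to a Discord webhook.
// Paths, the remote name and the webhook URL are placeholders.

$repos = [
    '/srv/git/backups/jtiong.dev.git',
    '/srv/git/backups/spark.git',
];

$summary = [];

foreach ($repos as $path) {
    // --mirror pushes all branches and tags to the backup remote, which is
    // assumed to already be configured on each repo under the name "github".
    $output = [];
    exec(sprintf('git -C %s push --mirror github 2>&1', escapeshellarg($path)), $output, $exitCode);

    $summary[] = sprintf('%s: %s', basename($path), $exitCode === 0 ? 'OK' : 'FAILED');
}

// Post a one-line-per-repo summary to a Discord webhook so I get pinged either way.
$ch = curl_init('https://discord.com/api/webhooks/XXXX/XXXX');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_POSTFIELDS     => json_encode(['content' => "Code backup run:\n" . implode("\n", $summary)]),
]);
curl_exec($ch);
curl_close($ch);
```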

Other Projects

This month though, I also worked on:

  • snackpack.gg – integrating it with my Snack Pack Discord server, which’ll let friends and family log in with their Discord accounts and see private content on the domain (members-only areas); a rough sketch of the login flow follows this list
  • topdownshooter – my first real game project; the name is self-explanatory. Written in the Godot Engine, partly to prove to myself that I can make a game that’s more than just a random prototype – it should have levels, a menu, and be packageable as a real release
  • Private Broadcasting System – a private broadcasting system for a friend; she’s an online radio DJ and runs a virtual club where people can tune in to listen to, and participate in, a talkback radio show
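As flagged above, here’s a rough, hedged sketch of how the Discord login for snackpack.gg could work – the client ID/secret, redirect URI and bare-bones session handling are placeholders rather than the actual implementation:

```php
<?php
// Hypothetical sketch of the Discord OAuth2 login for snackpack.gg.
// Client ID/secret and the redirect URI are placeholders.
session_start();

const CLIENT_ID     = 'DISCORD_CLIENT_ID';
const CLIENT_SECRET = 'DISCORD_CLIENT_SECRET';
const REDIRECT_URI  = 'https://snackpack.gg/auth/discord/callback';

if (!isset($_GET['code'])) {
    // Step 1: send the visitor to Discord's consent screen.
    header('Location: https://discord.com/api/oauth2/authorize?' . http_build_query([
        'client_id'     => CLIENT_ID,
        'redirect_uri'  => REDIRECT_URI,
        'response_type' => 'code',
        'scope'         => 'identify',
    ]));
    exit;
}

// Step 2: swap the returned code for an access token.
$ch = curl_init('https://discord.com/api/oauth2/token');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query([
        'client_id'     => CLIENT_ID,
        'client_secret' => CLIENT_SECRET,
        'grant_type'    => 'authorization_code',
        'code'          => $_GET['code'],
        'redirect_uri'  => REDIRECT_URI,
    ]),
]);
$token = json_decode(curl_exec($ch), true);
curl_close($ch);

// Step 3: identify the user and stash them in the session for the members-only areas.
$ch = curl_init('https://discord.com/api/users/@me');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => ['Authorization: Bearer ' . $token['access_token']],
]);
$_SESSION['discord_user'] = json_decode(curl_exec($ch), true);
curl_close($ch);
```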

All in all, it’s pretty cool to get back into some tinkering, and to have the time and wherewithal to do it.

405 Words

Recently, with my career, health has become something I’ve been a lot more conscious of – physical, mental, etc. So I’ve made the decision to move to 3 days per week, leaving Thursday and Friday available for health care and rest.

I’d like to eventually transition to a career in which I can work more independently as well, and the career options are pretty simple:

  • Come up with a product – build and sell it (SaaS, etc.)
  • Come up with a service – promote and sell said services (Contracting, etc.)

I have a couple of projects and things that I do which fall into the second category – I do some web hosting and consulting on the side, which produce some income for me. So I feel like I could certainly return to pushing those paths a bit more if need be.

Games are a passion project…

– me, now.

However, I’ve always wanted to build a game. Since I was a kid playing Super Mario Bros. on the NES back in the 90s, all the way through my adult life – I’ve always been a gamer.

In my mind, games aren’t something you build to make money – sure there’s that one in a million opportunity to build a Minecraft, or the next World of Warcraft. But that’s both extremely rare, and extremely difficult to achieve. Games are a passion project, and if you’re lucky, you get a financial reward if you find something that strikes a chord with the gamers who try your game out.

I’m at a stage in my career where I can afford one last hurrah at a passion project beyond the gaming events and marketing adventures of yesteryear.

Time to give it a go!

Do you have a plan?

I’m not quite sure about the games I’d like to make yet. But I think the plan is to build:

  • Some basic indie games, to learn games development, and;
  • Some basic art skills (2D – Aseprite, 3D – Blender) to flesh out said games.

As for sound creation and audio design – I may just leave that to third parties, if I’m honest. Audio is, and will forever be, a dark magic to me.

Okay…

So why am I blogging here about something I haven’t even started?

To keep myself publicly accountable. I’ve already told my mates on Discord; now I just have to execute 😂

posted this in: Servers, Technology
187 Words

So, my friends are big proponents of using https://ossrs.io (Open Source Simple Realtime Server) – which (in the way I’m intending to use it) will ingest RTMP streams from various OBS clients (my friends) and save their footage in a folder, which I can then use to edit various highlight reels from our gaming nights together.

The benefits are:

  • Storage – it’s saved on my end, so friends don’t need to worry about storage
  • Speed – the footage is broadcast directly to me and saved, they don’t need to send me files after the fact
  • Simplicity – they don’t need to worry about anything – most of them have OBS configured for streaming already

It works something like:

Pretty straightforward, pretty easy.

By default I’ve configured everything to save in 1800-second-long .flv files (30 minutes) – this could be .mkv or another format, but the quality is fine enough for my non-cinema-quality productions.

It’s a convenient little service to have set up for any future gaming nights as well. It’ll be great to have footage captured and stored as needed.

Now to find some sort of Media Asset Management system…

posted this in: Personal, Software, Technology
293 Words

I work on a lot of different personal coding projects. Depending on mood, or other factors, I’ll jump from project to project, working on them whenever time permits. My most recent role left me a little short of coding and development time, focusing instead on project delivery and team management to get a big product across the line at a FinTech startup.

So, to get my coding juices flowing once more, I thought I’d whip up some self-hosted work:

  • Deploy and install GitLab to manage my code
  • Build a site to track my activity to get back in the swing of things

Cue https://jtiong.dev

What’s the purpose of this?

It’s a simple, quick overview of my activity on the code repositories I’d ported over to my local GitLab installation. It lets me see what I’ve been working on recently, and keeps me motivated to continue polishing my skillsets.

What this project demonstrates

Well, this demonstrates several things I thought might be handy going forward:

  • Deploying an NGINX proxy (on an external machine) via Docker Compose, to manage incoming connections to jtiong.dev and various other websites
  • Deploying GitLab Community Edition onto a different physical server for my personal use and code management
  • Creating a Docker Compose deployment for the site, based on the php:7.4-apache image
  • Using my own framework Spark – to create this site
    • Using a self-written URL router
    • Using PSR-4 to autoload classes
    • Interacting with the GitLab REST API (a rough sketch of this follows below the list)
    • Using Bootstrap 4.x for a very simple frontend
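As flagged in the list above, here’s a hedged sketch of the GitLab REST API portion only – the GitLab hostname and the GITLAB_TOKEN environment variable are placeholders, and the real site renders the results through Spark’s views rather than echoing them:

```php
<?php
// Hedged sketch of the GitLab side of things (not the actual Spark code):
// list my projects, then pull recent commits for each one.
// The GitLab host and the GITLAB_TOKEN env var are placeholders.

function gitlabGet(string $path): array
{
    $ch = curl_init('https://gitlab.example.lan/api/v4' . $path);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => ['PRIVATE-TOKEN: ' . getenv('GITLAB_TOKEN')],
    ]);
    $body = curl_exec($ch);
    curl_close($ch);

    return json_decode($body, true) ?? [];
}

foreach (gitlabGet('/projects?membership=true&per_page=100') as $project) {
    foreach (gitlabGet('/projects/' . $project['id'] . '/repository/commits?per_page=20') as $commit) {
        // Spark's views render these into Bootstrap markup on the real site;
        // echoing is enough here to show the shape of the data.
        echo $project['name'] . ' - ' . $commit['title'] . ' (' . $commit['created_at'] . ')' . PHP_EOL;
    }
}
```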

Source code for the site can be provided on request, although I really need to clean it up.

It’s been good getting back in the saddle for some basic web development again! 🙂