Past Entries...

101 posts since April 2016!
posted this in: Personal, Software, Technology
293 Words

I work on a lot of different personal coding projects. Depending on mood and other factors, I jump from project to project, working on them whenever time permits. My most recent role left me a little short on coding and development time; I was focused instead on project delivery and team management to get a big product across the line at a FinTech startup.

So, to get my coding juices flowing once more, I thought I’d whip up some self-hosted work:

  • Deploy and install GitLab to manage my code
  • Build a site to track my activity to get back in the swing of things

Cue: https://jtiong.dev

What’s the purpose of this?

It’s a quick, simple overview of my activity on the code repositories I’ve ported over to my local GitLab installation. It lets me see what I’ve been working on recently, and keeps me motivated to continue polishing my skill set.

What this project demonstrates

Well, this demonstrates several things I thought might be handy going forward:

  • Deploying an NGINX proxy (on an external machine) via Docker Compose to manage incoming connections to jtiong.dev and various other websites
  • Deploying GitLab Community Edition onto a different physical server for my personal use and code management
  • Creating a Docker Compose deployment based on the php7.4-apache image for the site
  • Using my own framework, Spark, to create this site
    • Using a self-written URL router
    • Using PSR-4 to autoload classes
    • Interacting with the GitLab REST API (see the sketch after this list)
    • Using Bootstrap 4.x for a very simple frontend
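
As a taste of the GitLab API side of it, here’s a minimal sketch of the kind of request the site makes to pull recent commits from a project. The host, project ID and token handling below are placeholder assumptions for illustration, not my real setup:

```php
<?php
// Minimal sketch: fetch the latest commits for one project from a self-hosted
// GitLab instance via its REST API (api/v4). The host, project ID and token
// are placeholders.

$gitlabHost = 'https://gitlab.example.lan';   // hypothetical internal GitLab URL
$projectId  = 42;                             // hypothetical project ID
$token      = getenv('GITLAB_TOKEN');         // personal access token with read_api scope

$url = sprintf('%s/api/v4/projects/%d/repository/commits?per_page=10', $gitlabHost, $projectId);

$context = stream_context_create([
    'http' => [
        'method' => 'GET',
        'header' => "PRIVATE-TOKEN: {$token}\r\n",
    ],
]);

$response = file_get_contents($url, false, $context);

$commits = [];
if ($response !== false) {
    $commits = json_decode($response, true) ?: [];
}

foreach ($commits as $commit) {
    // Each entry includes fields like title, author_name and committed_date.
    echo $commit['committed_date'] . ' | ' . $commit['title'] . PHP_EOL;
}
```

The real site obviously dresses this up a bit, but the underlying request is about that simple.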

Source code for the site can be provided on request, although I really need to clean it up first.

It’s been good getting back in the saddle for some basic web development again! 🙂

posted this in: General, Servers, Software, Technology
455 Words

I’ve got several servers that I work on, and quite often this involves running regular cron’d tasks that perform various backups and configuration updates at odd schedules (for example, my Rust server wipes fortnightly and needs a config update to change the server name to reflect the last wipe date).

To do things like this, I’ve usually just written a script in PHP and run it at a given interval (daily or otherwise). There’s no real reason I chose PHP aside from familiarity with the language; no doubt the same could easily be achieved with Python, shell script, or any other language out there.

For now though, PHP serves my needs just fine.
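
As a rough illustration (not the real script), the Rust rename job boils down to something like the sketch below. The config path and the server.hostname line are assumptions about how the server is set up:

```php
<?php
// Rough sketch of one of these cron'd PHP scripts: stamp the latest wipe date
// into the Rust server's name. The config path, server name and the
// "server.hostname" line format are assumptions, not my actual setup.

$configPath = '/srv/rust/server/cfg/server.cfg';   // hypothetical config location
$wipeDate   = date('d/m/Y');                       // run on wipe day, so today's date

$config = file_get_contents($configPath);

// Swap the existing hostname line for one that includes the wipe date.
$config = preg_replace(
    '/^server\.hostname\s+".*"$/m',
    sprintf('server.hostname "JT Rust | Wiped %s"', $wipeDate),
    $config
);

file_put_contents($configPath, $config);
echo "Server name updated with wipe date {$wipeDate}" . PHP_EOL;
```

Cron then just runs the script on the wipe schedule.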

The problem is, I don’t actually keep these scripts backed up anywhere, or organised in any sort of manner!

The age of GitLab

Over the last couple of days, I’ve added GitLab to my homelab stack (JT-LAB), and will be using it to store most of my code as a “source of truth”, syncing things to GitHub afterwards (depending on the project, of course).

To the Game Servers, Four Branches…

Based on the various server types, specific branches would be used. For now, these would be:

  • Rust
  • Minecraft
  • Factorio
  • Satisfactory

Each game would be represented by its own branch, and based on that branch, a specific set of commands would be deployed as needed (a rough sketch of what that might look like is below). For the most part, only Minecraft persists its world; the rest rely on either a voted or a scheduled wipe.
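
To give an idea of the shape of it, here’s a very rough sketch of a dispatcher that runs a different set of commands depending on the branch it’s checked out on. Every script name and path here is a placeholder, not an actual deployment step:

```php
<?php
// Very rough sketch of branch-based dispatch: detect the checked-out branch,
// then run that game's maintenance commands. All paths and script names are
// placeholders.

$branch = trim((string) shell_exec('git rev-parse --abbrev-ref HEAD'));

$tasks = [
    'rust' => [
        'php /opt/cron/rust-wipe-rename.php',     // hypothetical wipe/rename job
    ],
    'minecraft' => [
        'php /opt/cron/minecraft-backup.php',     // hypothetical world backup
    ],
    'factorio' => [
        'php /opt/cron/factorio-config.php',      // hypothetical config refresh
    ],
    'satisfactory' => [
        'php /opt/cron/satisfactory-backup.php',  // hypothetical save backup
    ],
];

foreach ($tasks[$branch] ?? [] as $command) {
    echo "[{$branch}] running: {$command}" . PHP_EOL;
    passthru($command);
}
```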

To the File Systems, Five Branches…

Then we have servers with actual file resources and assets that I’d like to keep: things like photos, design assets, old code references, etc. These would be:

  • Media
  • Design
  • Research
  • Education
  • Maintenance

And nine, nine branches were gifted to the Websites

I also run a number of websites for friends and family at a prosumer level. I won’t really list these projects, but they do total nine! So it all kind of fits the LOTR theme I was going for with these titles.

One Repo to Rule them all…

The decision to build everything into one repository to manage all the core backup operations means I have less to track; for a personal system, I think this is fine. Monolithic design probably isn’t the way to go for a much larger operation than mine though!

Announcing…

Cronjobs

So this is the project I’d like to build over the next few days, primarily in combination with jtiong.dev, which will help track the commits I make. Writing these projects up here as more formal project whitepapers might also help with some resume material for my future career 🙂

posted this in: Servers, Software, Technology
217 Words

With my JT-LAB homelab in place, it stands to reason I should self-host whatever I can to get decent use out of the stupid amount of money I’ve poured into the project. Being able to ensure I’m only sharing the data that I want to share (for whatever reason) is also pretty important to me.

The majority of my code to date has been stored on GitHub (which is fine; it’s a fantastic, free resource for the world). But my workplace had a GitLab installation that I thought was done pretty well.

So, I’m going to take it upon myself to deploy GitLab in JT-LAB and make sure there’s a version of my work that lives there. GitHub will essentially become my backup for code (GitHub being far more reliable in uptime than anything I’d run myself).

Why is this so important?

GitLab is going to work as my core repository and project management system; with it, I’ll be able to store and update my code for the various projects previously mentioned.

What’s the challenge?

Well, after some wrestling it’s up and running; however, the one area I’m most unsure about is GitLab’s Auto DevOps feature.

Lots to learn!

posted this in: General, Personal, Software, Technology
400 Words

So, I’ve got a “main” website: https://jtiong.com (which is currently Error 500’ing).

It runs on a fairly old version of Laravel. Since its inception, the site has mainly served as a central one-stop shop for everything about my presence on the internet. Oh, how times have changed.

Nowadays, with the number of domains I own, it makes more sense to split my content and footprint on the internet from a single jtiong.com website into a number of different sites, based on what people are trying to find me for, or to categorise the activities I do.

Domains I have include:

  • jtiong.blog (this site) – my personal blog, which is strictly just personal, non-professional stuff
  • jtiong.dev – where I hope to eventually host some sort of software development info about myself
  • jtiong.network – currently a serverless site experiment, though I hope to change this
  • jtiong.com – a central landing page from which people click through to the other domains

So what does this mean?

Two new projects! The .com and .dev domains will be important as part of my “online resume”, so I really should get them done sooner rather than later…!

However, this also means I need to really look into how I implement these!

Laravel will be driving:

  • jtiong.com – a landing page/gateway system
  • jtiong.network – services and resources for friends & family

I’m looking at using the Socialite package for Laravel to integrate login via Discord. This will mean certain links and features are only visible to friends & family who have particular roles in my Discord server; or at least, that’s the original intent.
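
As a very rough sketch of the Discord side, and assuming the community Discord provider for Socialite (socialiteproviders/discord) is installed and wired up, the login flow would look roughly like this; the role checks are still just intent, so they’re left as a comment:

```php
<?php
// Rough sketch of Discord login via Laravel Socialite. Assumes the community
// Discord provider (socialiteproviders/discord) is installed and registered;
// this is a sketch of the idea, not a finished implementation.

use Illuminate\Support\Facades\Route;
use Laravel\Socialite\Facades\Socialite;

Route::get('/auth/discord/redirect', function () {
    // Send the visitor off to Discord's OAuth consent screen.
    return Socialite::driver('discord')->redirect();
});

Route::get('/auth/discord/callback', function () {
    // Discord redirects back here with the authorised user.
    $discordUser = Socialite::driver('discord')->user();

    // The plan: look up this member's roles in my Discord server (via the
    // Discord API, using a bot token) and gate links/features on those roles.
    // That part is still just intent, so it's a placeholder for now.

    return 'Hello, ' . $discordUser->getNickname();
});
```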

My own framework, Spark, will be driving:

  • jtiong.dev – dev blogs, resources

This dev site will be more of a technical dump to keep me consistently working on my coding skills. It’ll be a traditional website that rides on the back of my intended GitLab installation. The fallback, of course, is to just use the GitHub API, but I’ll only look at that later.

The site should just list out my commits and which projects they were made on, to keep things accountable and interesting. It’s just a cool little showcase project.
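
If the GitHub fallback ever does happen, the equivalent listing is a similarly small call against GitHub’s REST API. The owner and repo names below are placeholders, and this only covers public, unauthenticated access:

```php
<?php
// Sketch of the GitHub fallback: list recent commits for a public repository
// via GitHub's REST API. Owner and repo names are placeholders.

$owner = 'example-user';
$repo  = 'example-repo';
$url   = "https://api.github.com/repos/{$owner}/{$repo}/commits?per_page=10";

$context = stream_context_create([
    'http' => [
        'method' => 'GET',
        // GitHub's API requires a User-Agent header on every request.
        'header' => "User-Agent: jtiong-dev-sketch\r\nAccept: application/vnd.github+json\r\n",
    ],
]);

$response = file_get_contents($url, false, $context);
$commits  = $response !== false ? (json_decode($response, true) ?: []) : [];

foreach ($commits as $commit) {
    echo $commit['commit']['author']['date'] . ' | ' . $commit['commit']['message'] . PHP_EOL;
}
```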

More features might be added later relevant to doing development work in the future!

posted this in: General, Hardware, Servers, Technology
107 Words

So, it’s been several days since my last post about Ryzen C-states and power management stuffing up Unraid OS.

I’m happy to report that things have been rock solid. For the last 90 hours or so, I’ve been solidly downloading my backups from Google Drive (yes, many years’ worth of data) onto the server. At the same time it’s been actively running as an RTMP bridge for all the security cameras around my house, and as an internal home network portal, all without falling over.

Here’s hoping I didn’t just jinx it….

Update (8th August 2022): Unraid’s been running solidly for 10+ days now!

posted this in: Hardware, Servers, Software, Technology
157 Words

I recently chose to go the Unraid route with my media storage server; I was lucky enough to be given a license for Unraid Pro, and straight up, let me say:

  1. It’s easy to use
  2. It’s beautiful to look at
  3. It’s stupid simple to get working

BUT

My server uses an old spare desktop I had lying around:

  • AMD Ryzen 7 1700 (1st generation Ryzen)
  • 32GB DDR4 RAM
  • B450 based motherboard

But therein lies the problem. It turns out that Ryzen crashes and burns with Unraid by default. You need to go into your BIOS settings and disable the Global C-State Control power management setting. Insane.

Why am I writing about this?

Because it took me two weeks to reach this point: wrestling with Windows storage, wrestling with shoddy backplanes in my ancient server chassis (for which I ordered a replacement case that set me back a pretty penny), a new SAS controller, new SAS cables…

This is an expensive hobby, homelabs.

posted this in: Hardware, Servers, Software, Technology
238 Words

Local media storage. Yeah.

That’s right, I’m running Windows 10 Pro for a home server 😂

It’s been good so far. The machine is pretty old, but it’s there for running things like local media storage and maybe a few other things that aren’t GPU reliant. It has an ancient PCIe x1 GPU in it (a GT 610, haha) that can’t really do anything more than let me remote in and work on the PC.

Although I do definitely want to run:

  • Core Keeper
  • V Rising

On the PC for friends and family to check out 🙂

Storage is a bit interesting; I forked out for StableBit DrivePool and StableBit Scanner (there’s a bundle you can get), and it’s a simple GUI where I just click “Add” to expand my storage pool with whatever randomly sized hard drives I have.

Why’d I do this instead of the usual ZFS or Linux-based solution?

Mostly to keep my options open; it’s nice having a GUI, and Windows can handle my needs on my local network. I’m not doing anything extremely complicated, and the “server” it’s on is going to act as a staging ground for anything that hasn’t yet gone into the Google Drive archive.

I could just as easily (probably more easily) achieve the same results with something like Ubuntu Server, except for the game servers mentioned above. Some games just run much better on a Windows host, so that’s what this machine is for.

posted this in: Hardware, Servers, Technology
58 Words

The JT-LAB rack is finally full; all the machines within are the servers I intend to have fully operational on the network! Not all of them are turned on right now, though. There are a few machines that need some hardware work done, but that’s a weekend project, I think 🙂

Racked and fully loaded…!
posted this in: Gaming, Servers
22 Words

So the new Minecraft version is out, and with it I’ve created a new Vanilla server for my friends to play on.

posted this in: Ramblings, Servers, Technology
140 Words

Just a short little update to myself that I’ll keep. I’ve acquired:

  • Dell R330 – R330-1
    • 4 x 500GB SSD
    • 2 x 350W PSU
    • 1 x Rail Kit
    • INSTALLED AND READY TO GO
  • Dell R330 – R330-2
    • 4 x 500GB SSD
    • 1 x 350W PSU (need to order 1)
    • 1 x Rail Kit (just ordered)
    • Awaiting PSU, and Rail Kit

These are going to help me decommission my Dell R710 servers. Trusty as they are, they’ve reached their end of life, for sure. I’ll keep them as absolute backup machines, but I won’t be using them on active duty anymore.

R330-1

  • Websites

R330-2

  • Rust (Fortnightly)
  • Project Zomboid (Fortnightly)
  • Minecraft (Active Version)

It’s actually been pretty tricky keeping decent track of everything, so I’ve recently signed up for the free tier of Atlassian’s Jira and Confluence. Something a little more formal for my use.