
How the site is built

Suggested previous reading: Details on how the site is hosted

Initial requirements

When I had the idea to make this site, I had two definite requirements for appearance and functionality. The first was simple: the site had to have a dark theme to complement the default bright theme. I'm a huge fan of sites and apps with dark themes; they're just so much easier on the eyes. I know, though, that not everyone is a fan of bright text on dark backgrounds (and truthfully, during the day I sometimes prefer bright themes myself), so there had to be a choice for people. I had (at least) three ways to do this, so I knew I could make it happen: a client-side JavaScript button to toggle the theme, PHP that would serve different CSS depending on a query parameter in the URL, or a separate subdomain that would return different CSS.

The second requirement was a little more nebulous and a bit more complicated: whatever I came up with, the site had to be in line with my general principles, and be a site that I personally would find appealing even if it weren't mine. That basically meant the following:

1. No advertisements
2. Fast page loads
3. As few HTTP requests as possible
4. No unnecessary dependencies
5. Visually free of clutter

The first is self-explanatory. I, along with most others I imagine, don't like ads. While I understand their purpose, and I understand that developers want to make money somehow (because people generally expect free content and rarely pay to support websites), I dislike how ads are implemented today. Most ads are obnoxious and slow, they waste your bandwidth, they're another vector for malware, and on top of that we're generally overloaded with advertisements and marketing campaigns... Just unpleasant. Because I dislike ads so much, it wouldn't sit well with me to turn around and add them to a site or app of my own. Eventually, I might create a Patreon or something so that I can upgrade the hosting of this site and make it load-balanced (thus increasing its uptime), but I would never, ever resort to ads.

The second and third points blend together. Basically, I want a site that is fast, because why have a slow site? Amongst other things, that meant loading the site with as few HTTP requests as possible. At minimum, that meant bundling any JavaScript and CSS into the page itself rather than serving them as separate files. As I started working on the site and writing my first scripts, I realized it also meant no PHP and no JavaScript, at least where possible (and so far, that's everywhere: there is no JS or PHP on my site at this point). Basically, I realized that I could statically generate the HTML, and that, overall, it would end up being easier, faster, and more performant than having every page be dynamically generated.

But to continue with why I wanted speed, and what that meant to me: lots of sites now pull in lots of JS and CSS files, and I think it's largely unnecessary. Yes, having your JS and CSS in separate files means the browser can cache them, so you don't redownload the same data across multiple pages. But, honestly, does your site really need that much CSS/JS? There's a markup language called AsciiDoc which I was hoping to use for my site, because it'd let me easily write posts, have syntax highlighting done for me, and get nice-looking output. But when I looked at the result and saw that I got 37KB of HTML and CSS for something I could do in less than 1KB if I wrote it by hand, I knew I couldn't go that route. That's just an insane amount of bloat; dozens of lines of the generated CSS were never even used in the generated HTML, and the number of nested divs was just terrible. I would rather write it by hand (and did) and design the appearance of the site to be minimal and spartan than intentionally carry such extreme baggage.

This leads into my next point, which is that there shouldn't be unnecessary dependencies. Thus far, I've succeeded in that endeavour, as I've written everything from scratch. The idea is this: if I can create something on my own that does precisely what I need it to do, I can often do that faster than the whole process of looking up what's out there, comparing different solutions, trying to implement one, inevitably finding where it fails to meet my requirements, trying to fix it or find a different one, and so on. Plus it's more fun (to me at least) to tinker around and come up with the perfect solution.

Anyways, unnecessary dependencies. Part of this meant minimizing the use of JavaScript, especially for things that could be done with CSS (specifically responsive design, which means having a site that looks good on all devices). Turns out, there are some cool things you can do in CSS, and you can make a site responsive without needing Bootstrap and JavaScript. I have used this to my advantage.
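
To give a rough idea of the sort of thing I mean (this is just an illustrative sketch rather than my actual CSS, and the class name is made up), a readable max-width plus a media query already gets you most of the way to a responsive layout, with no JavaScript involved:

    /* Keep text at a readable width on large screens,
       let it fill the viewport on small ones. */
    .content {
        max-width: 50em;
        margin: 0 auto;
        padding: 0 1em;
    }

    /* Adjust for narrow screens -- still no JavaScript needed. */
    @media (max-width: 40em) {
        body {
            font-size: 0.9em;
        }
        .content {
            padding: 0 0.5em;
        }
    }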

One dependency I'm still conflicted about forgoing is a custom font. For the sake of having a site with no additional dependencies after the initial HTML is loaded, I've forgone custom fonts entirely, so you are probably reading this in Courier New. I actually did some research on the speed of web fonts (I might write up a post on this later), and it turns out they can be really bad for page render speed, because until the font is loaded (if it has to be downloaded), text can't render. That's not good. So, I made the hard decision to not include a custom font.
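
With no custom font, the relevant CSS boils down to a plain font stack, something along these lines (a sketch, not necessarily my exact rule):

    /* No @font-face, nothing to download: the browser uses whatever
       monospace font is already installed (Courier New on most systems). */
    body {
        font-family: "Courier New", Courier, monospace;
    }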

The final main requirement was that the site had to be visually free of clutter. There are some sites out there that are just too busy, with so much going on that it gets in the way of the content. But I've also seen some really sharp minimal websites. Obviously, I went the minimal route. Personally, I'm largely pleased with the appearance of the site as-is, though there are a few minor things I'm still working to improve.

What I did

There are four servers set up in my nginx configuration: the main site, the dark version of the main site, and preview versions of both. Each serves its content from a different directory.
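
In nginx terms, that's simply four server blocks, each with its own root. Roughly like this (the subdomain names and paths here are placeholders, not my actual configuration):

    # Four server blocks, one per variant of the site.
    server {
        server_name thinkingincode.ninja;
        root /var/www/site/bright;
    }
    server {
        server_name dark.thinkingincode.ninja;
        root /var/www/site/dark;
    }
    server {
        server_name preview.thinkingincode.ninja;
        root /var/www/preview/bright;
    }
    server {
        server_name preview-dark.thinkingincode.ninja;
        root /var/www/preview/dark;
    }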

I have a git repository for my site which I push to a private repository on GitHub. I have two copies of it on my server: one for the preview site and one for the live site. When I make changes to the site appearance or generator scripts, or am working on new posts, I do so in the copy for the preview site. I make changes, regenerate the site, and make sure I like them. Then, when I'm happy, I commit the changes, push them up, switch to the copy for the live site, and pull them back down. When I pull changes down, the live site is automatically regenerated.
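
The automatic regeneration on pull doesn't need anything fancy; a git post-merge hook is one way to do it (a sketch, assuming a generator script called generate-site.sh, which is an illustrative name rather than the real one):

    #!/bin/bash
    # .git/hooks/post-merge -- runs after a successful 'git pull'.
    # Regenerate the live site from the freshly pulled sources.
    cd "$(git rev-parse --show-toplevel)" || exit 1
    ./generate-site.sh

The hook just has to be marked executable; after that, every pull into the live copy rebuilds the site.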

All pages have the same appearance: there's a header on top with a link to the index page for the site you're on and a link to the bright or dark version of the current page. The content is in the middle of the page. Finally, there's a footer with the site navigation.
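
Stripped down, every generated page shares the same skeleton, something like this (an illustrative sketch, not the literal markup):

    <!DOCTYPE html>
    <html>
    <head>
        <title>Page title</title>
        <style>/* all of the page's CSS is inlined here, no separate stylesheet */</style>
    </head>
    <body>
        <header><a href="/">thinkingincode.ninja</a> <a href="...">[Dark]</a></header>
        <main><!-- the content of the page goes here --></main>
        <footer><!-- site navigation --></footer>
    </body>
    </html>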

I have a script that pulls together the header and footer, generates the theme links, adds in the CSS as appropriate, and puts the content of the page (whatever that may be) in the middle.
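
Conceptually, that script is little more than concatenation with a bit of substitution. A stripped-down sketch (my real script does far more; the template, file, and placeholder names here are illustrative):

    #!/bin/bash
    # Sketch of a page generator (call it generate-page.sh): glue the pieces of
    # one page together, with the CSS inlined and the theme link filled in.

    theme="$1"      # "bright" or "dark"
    content="$2"    # path to this page's content fragment
    out="$3"        # where to write the finished page

    # The theme link in the header points at the other variant of this page
    if [ "$theme" = "bright" ]; then other="dark"; else other="bright"; fi

    {
        cat templates/head.html                            # doctype and the start of <head>
        echo "<style>"
        cat "css/${theme}.css"                             # CSS inlined, no extra request
        echo "</style>"
        echo "</head><body>"
        sed "s|__OTHER__|${other}|g" templates/header.html # top bar with the theme link
        cat "$content"                                     # the content of the page
        cat templates/footer.html                          # footer, navigation, </body></html>
    } > "$out"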

At a high level, I currently have the following folder structure. All data is stored as plain text in files; there are no JSON objects or anything like that, just files with specific names and specific content inside.
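
Roughly, the layout is along these lines (a simplified sketch; aside from /posts/ and /misc_pages/, the folder names here are illustrative stand-ins):

    posts/                     # one folder per post (see below)
        how-this-site-is-built/
    misc_pages/                # static pages that aren't posts
    templates/                 # header and footer fragments   (name illustrative)
    css/                       # bright and dark stylesheets   (name illustrative)
    scripts/                   # the generator scripts         (name illustrative)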

In the /posts/ directory, as I said, I have a folder for each post. This post, for example, resides in the folder /posts/how-this-site-is-built/. The name of the folder becomes part of the URL (as is evident if you check right now). Inside a post folder, the following files can be found (a few of them are optional).
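
A rough sketch of what one of those folders holds (only the generate and publish flags and the optional publish-after file come up again later in this post; the other file names are illustrative):

    posts/how-this-site-is-built/
        content.html       # the body of the post              (name illustrative)
        title              # the post's title                  (name illustrative)
        tags               # the post's tags, one per line     (name illustrative)
        generate           # flag: should this post be generated at all?
        publish            # flag: should this post be live?
        publish-after      # optional: don't publish before this date/time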

I did it this way, using folders and files, because I wanted to just start with something and build as I went. Turns out, this worked great, and I was able to build a pretty nice system. Using files and folders meant I could use git for version control, and that I could easily write some bash scripts to generate my site.

I'll go over the generator scripts in greater detail in a later post, as this one is long enough. But, basically, here's how it works. I have a script that generates a single page, whether that page is a static page (like those from /misc_pages/) or a post. This script pulls all the components together, includes the appropriate CSS, and so on. Then there's a script that calls the first in order to generate the entire site (posts, series listing pages, tag pages, other static pages, the RSS feed), calling it twice for each page: once for the bright version and once for the dark version. The larger script does a lot of validation (though the page generator does as well) and makes sure to respect the generate and publish flags for each post. It directly creates a few dynamic pages itself, outputting HTML that gets wrapped up by the page generator script. It also cleans up files that aren't live anymore (such as when I hide a post).
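
In outline, the site-wide script is a loop over everything that needs a page, calling the page generator twice each time. A heavily simplified sketch (names are illustrative, and the flags are treated as marker files purely for illustration):

    #!/bin/bash
    # Heavily simplified sketch of the site-wide generator loop.
    for post in posts/*/; do
        name=$(basename "$post")

        # Respect the per-post flags: skip anything not meant to be generated or live
        [ -e "${post}generate" ] || continue
        [ -e "${post}publish" ]  || continue

        # Every page gets built twice: once bright, once dark
        mkdir -p "out/bright/${name}" "out/dark/${name}"
        ./generate-page.sh bright "${post}content.html" "out/bright/${name}/index.html"
        ./generate-page.sh dark   "${post}content.html" "out/dark/${name}/index.html"
    done
    # The real script also builds tag pages, series pages, other static pages,
    # and the RSS feed, and removes output for anything that is no longer live.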

The site generator script is run for the live site every 15 minutes with a special flag, which causes it to regenerate the site only if there's a new post ready to be published (based on the optional publish-after file). The live site is also regenerated automatically when changes are pulled down from the GitHub repository, and the preview site is regenerated whenever I change an HTML file. I can also regenerate either site at any time by running a command.
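
The every-15-minutes part is plain cron; something like the following entry (the path and the flag name are illustrative, not my real ones):

    # crontab entry: every 15 minutes, attempt a regeneration that only actually
    # runs if a post has become publishable (its publish-after time has passed)
    */15 * * * * /home/me/site-live/generate-site.sh --only-if-new-post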

All in all, I've created a pretty robust system for building this site. I created the core of it in 3 or 4 days, though I'm adding more as I go to make the system even better. At some point in the near-ish future, I'll make a post detailing the generator scripts themselves, but I have more tweaks to make to them before then. Hopefully, though, this post helps you better understand the inner workings of this site and the effort that has gone into it. As always, thanks for reading, and please feel free to reach out to me with any questions or comments, or just to say hi!