How I Make Bitmap-esque Art
Nixx is back, back again.
I was feeling the itch to spam this site with more nonsense posts before we hit 1000 IPs; as of this morning, there were 873 recorded. I've been fairly busy with work and courses for a while, but all of a sudden the storm's cleared for the moment.
I've added a few more backgrounds to the site, among some other minor changes. You can get the backgrounds on my website repo, under the /images/backgrounds/ directory, alternatively you can find them on the bg.php page which is used in the main stylesheet. Since I've done that, I thought I'd mention how I make the style of images I use on this site, as it's surprisingly easy.
Making the Images
For this I'm using GIMP - get it from its website, or from almost any Linux distro's repositories. I'm using version 2.10, though I'm sure the steps apply to many other versions. These aren't complex actions anyway, so a similar method likely applies to other editors. I'd bet money there's even a way to automate it with ImageMagick that I'm as yet unaware of.
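As a gesture in that direction, here's an untested sketch of what that might look like with ImageMagick. The filenames are placeholders, and I'm only assuming these options mirror GIMP's behaviour; the generated gradient just stands in for a real photo:

```shell
# Hypothetical ImageMagick one-liner ('convert' may be 'magick' on IM7).
if command -v convert >/dev/null; then
    # Throwaway stand-in input image.
    convert -size 64x64 gradient:black-white /tmp/in.png
    # Reduce to two colours with Floyd-Steinberg dithering.
    convert /tmp/in.png -dither FloydSteinberg -colors 2 /tmp/out.gif
fi
```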
High-res, heavy-contrast images are recommended.
When you have the image you want at the scale you want, go to Image > Mode on the ribbon, and select "Indexed". Under Colourmap, select "Generate Optimum Palette" and choose 2 colours. Under Dithering, I generally prefer "Floyd Steinberg (reduced colour bleeding)" - play with the others to see the effects.
The hard part is already done
You now have a two-colour image. Congrats. Wasn't hard.
Because the two-colour palette is generated from the most common colours in the image, you may or may not be happy with the colours you end up with. You can change these colours fairly easily, though. Go back to Image > Mode and return the image to "RGB", releasing it from being locked to only two colours. The colours can now be whatever you want.
I have two main "palettes" of colours I will change my images to at this point - one is simple white on black (used in the backgrounds), and the other is pink on transparent (used in many other images around the site). You can use this for whatever combination of colours you want, too. Go to Colours > Map, and select "Colour Exchange". From there, it's pretty clear how to pick a from and to colour to change each of your two colours. Under Colours, "Colour to Alpha" may be used to make a colour transparent, although you should make sure the opacity threshold is at the minimum, to prevent the transparency from bleeding into the other colour.
And colours, done
Here's one I made just for the sake of this example:
I recommend exporting them as a gif - a jpg loses fine detail, and png isn't as well suited to this kind of compression. Gifs are great for their low file size relative to resolution.
I was partially inspired by the work of Mattis Dovier - although what he does looks much better than mine, and I could guess he does much more of his work manually.
All of the above belong to Mattis Dovier, not me.
And of course by Fauux, who makes far more interesting things than I ever will.
The above can be found on fauux.neocities.org, and does not belong to me.
Of course, rather than choose just two colours, you can achieve a nice aesthetic with 4 colours, or 8 colours. Under Filters, there is "Blur" and "Distort", which host a variety of interesting tools to play with, including pixelisation and video degradation effects. Filters in general are worth thoroughly exploring. In the end, with minimal effort you can get something like this:
Also, if you use a literal bitmap (not a pretend one), you can edit it much more manually. Bitmaps can be edited in Vim, and twisted in Audacity, provided you avoid the first few percent of the image's data (the header). Really. In reference to this post, the "heart" I made was a mix of GIMP and Vim. The glitching star animation was a mix of GIMP, Vim, and Audacity. I've seen people make far more impressive images with presumably a greater time investment.
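For instance, the byte-twiddling can be sketched in shell with dd - the zero-filled file below is only a stand-in for a real .bmp:

```shell
# Stand-in "bitmap" - in practice this would be a real .bmp file.
head -c 4096 /dev/zero > /tmp/demo.bmp
cp /tmp/demo.bmp /tmp/glitched.bmp
# Scribble 64 random bytes somewhere past the header region (BMP headers
# are typically 54+ bytes; corrupting them breaks the file outright).
dd if=/dev/urandom of=/tmp/glitched.bmp bs=1 seek=512 count=64 conv=notrunc 2>/dev/null
```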
Touhou 1 - 5 (and PC-98 in general) on DOSBox-X
I completed Mystic Square in one run a few days ago, which reminded me that getting it to play correctly was once a pain, so I thought I may as well write about it.
Games about cute girls with frilly hats. In the mainline series they shoot at each other with magical powers.
It's a bullet-hell game, with just the right amount of edge to make me feel invested and exasperated when I finish the game, but not so much that it's impossible to finish it in one run. Major enemies and bosses are by far the most fun, often having several different attack patterns you need to learn very quickly to advance through the game. With many playable characters (in the later games), difficulty levels, and a choice of initial extra lives, it's extremely replayable, and very easy to pick up and put down whenever.
The music also deserves a mention, because the composition is fantastic and I will load an emulator just to go on "Music Room".
It's great for a mid-to-late-90s arcade-style game, with the later PC-98 entries fitting in 11 MB and the first two in half of that. I would argue the games became more engaging, better balanced, and showed more experience from the developer as the series went on. The only thing I feel is really lacking is a save feature, or a functional save-state alternative in my emulator (I've yet to try others besides DOSBox-X - so feel free to make me feel stupid).
Also, I play because it's free (as in "free beer"). I'm the cheapest person I know. Any game released for the PC-98 is effectively abandonware, as the PC-98 was discontinued in 2003 and stopped shipping in 2004.
Copies are Hard to Come By
The font.rom file is essential, so download that too.
To be fair, there are no strong reasons - I just like that DOSBox-X can emulate a variety of environments ("'Bloat, bloat,' they howl"), including MS-DOS and PC-98. If you only care about PC-98, xnp2 is a fine choice, and referenced on the TouHou Files page.
Obviously, get DOSBox-X. Compilation is driven by a makefile - it's a rather large program, so expect it to take a while and run your CPU a little toasty. Alternatively, I hear of an RPM and a Snap existing, but I prefer to compile. The wiki discusses some of its more interesting features.
You'll find a file in your build directory (assuming you compiled - if not, you're on your own from here) called dosbox-x.reference.conf. You can safely copy the whole thing to dosbox-x.conf, and from there start editing. Or create an empty dosbox-x.conf file and include only what you need, for clarity. Some of my useful options, for PC-98 emulation:
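My original settings aren't reproduced here, but a minimal sketch might look something like this - the machine and scaler values are the two discussed below, while the aspect entry is my own guess at a sensible addition:

```
[dosbox]
machine = pc98

[render]
scaler = rgb2x forced
aspect = true
```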
Yes, the headers in square brackets are necessary. Some are essential, like "machine = pc98", some are just personal aesthetics, like "scaler = rgb2x forced". Play around - you'll have a different monitor to me, and likely a different keyboard (mine is jp106). There are probably plenty of useful settings I don't know of, I've just done the bare minimum to have it functional and aesthetically pleasing. Many of these settings can be found in the GUI, but this sets them permanently and is more robust. I also believe these can be passed as parameters when executing "./src/dosbox-x", but I don't know why you'd want to do it that way and I've not tried it.
Set-up - font.rom
Additionally, remember the font.rom file from earlier? Drop that in the build directory. Things should look a lot better after that.
You should now have an emulated machine capable of playing PC-98 games, and making them look good on your screen at that. Congrats.
Actually mounting the files to boot and play
There was also surprisingly little information on this part, so here goes.
You need to be in the bottom of the root directory when executing the program, where font.rom can also be found. I have a simple bash alias set up like so:
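For illustration, something along these lines in your .bashrc - the path is entirely my assumption, so adjust it to wherever your build (and font.rom) actually lives:

```shell
# Hypothetical build location - the point is to land where font.rom lives.
alias touhou='cd ~/src/dosbox-x && ./src/dosbox-x'
```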
When you're actually within the emulator:
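From memory, the mount-and-boot sequence looks roughly like this at the DOSBox-X prompt - the filename is a placeholder, and the drive letter may differ for your image:

```
IMGMOUNT A game.hdi
BOOT A:
```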
Your game should hopefully now be good to go. I've found minor issues with TouHou 3, while TouHou 5 works great. In theory, you can play any PC-98 game you can get your hands on.
The 3 images in this article do not belong to me - they are screenshots of gameplay of 東方怪綺談 ～ Mystic Square, and the game belongs to Team Shanghai Alice/ZUN.
DOSBox-X includes the functionality to take a screenshot, which you can bind to a less awkward key. This takes a screenshot of the game without any of the aspect or scaler effects, and frankly those look better than the screenshots I took with scrot. Oh well - wish I'd known that before I made this post. Nonetheless, it at least gives you a good idea of how my configuration settings make the screen look.
2021.04.12: This was originally a link to 2h.ryhl.io, but they seem to have since then gone down/stopped hosting the files.
A Public Git Server People can Clone From
But not edit - at least, not without your permission.
Obviously, you need to have Git installed, and you need SSH access to your server. I'll let you work that out.
If you already have a domain with web content on it, you'll want to give your Git server a separate subdomain and root directory - see your web server software's documentation. Subdomains (e.g. git.*.*) are generally free.
I'm going to make an effort not to repeat the words of other people and other articles, because it's just filler. For setting up a basic Git server for your own use, refer here.
You basically need 3 things at this point - your personal, public SSH key; a repository on your machine; and a "bare" repository directory on the remote server. Run ssh-keygen if you don't already have a key, it's fine (I prefer) to leave it blank.
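As a local sketch of those moving parts (the paths and placeholder identity are mine), with a "bare" repository standing in for the server side:

```shell
# A bare repository standing in for the server side - on a real server this
# directory would live on the remote machine, reached over SSH.
mkdir -p /tmp/git-demo
git init -q --bare /tmp/git-demo/project.git

# A working repository that pushes to it.
git init -q /tmp/git-demo/work
cd /tmp/git-demo/work
git config user.email "you@example.com"   # placeholder identity
git config user.name "You"
echo "hello" > README
git add README
git commit -qm "initial commit"

# On a real server the remote URL would be over SSH instead of a local path.
git remote add origin /tmp/git-demo/project.git
git push -q origin HEAD
```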
You may like to add the directory "git-shell-commands" to the home directory of your Git user, this allows for a Git prompt on logging in as the Git user.
If you haven't done so already (this is a system-wide procedure), you may also want to disable password SSH logins, and rely purely on your key. See "/etc/ssh/ssh_config" and "/etc/ssh/sshd_config", or your system's equivalent.
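To the best of my knowledge, the relevant sshd_config directives are these two - check your system's man page before locking yourself out:

```
# /etc/ssh/sshd_config
PasswordAuthentication no
PubkeyAuthentication yes
```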
Having Completed the Above
You should be able to push and pull to your repository with your SSH key. You can add collaborators to "authorized_keys" also.
However this isn't particularly useful for a public Git server - in order to clone from your repository, users would need to have an authorized SSH key.
SSH by design only allows for authenticated logins. So, you need to depend on HTTP.
A Git Server with HTTP Read (but not Write) Access
First of all, just make sure you have an open HTTP (80) or HTTPS (443) port.
Two things are needed:
Firstly, you'll need to set up the Git daemon, to allow exporting of given repositories. This is done on a per repository basis.
It's simple enough to just run the command, and go. My Git repositories are stored under "/usr/share/nginx/git", that's reflected here.
But it is far more sensible to include it as part of an init script, with whatever init system you happen to be using. Systemd shown below:
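A minimal sketch of such a unit, reusing the same repository path - the flags and the dedicated "git" user are my assumptions, so adjust to taste:

```
[Unit]
Description=git daemon
After=network.target

[Service]
ExecStart=/usr/bin/git daemon --base-path=/usr/share/nginx/git --reuseaddr
Restart=always
User=git

[Install]
WantedBy=multi-user.target
```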
Adjust as you need, move to the relevant location, and enable it.
With the daemon enabled, repositories to be shared over HTTP should have an empty file called "git-daemon-export-ok" in the repository directory. This takes effect instantly. However, it won't work without the second part:
On your server-side repository, you will find a directory called "hooks". The hook "post-update" must be enabled - this is as simple as moving "hooks/post-update.sample" to "hooks/post-update".
For this to take effect, run "git update-server-info".
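Concretely, demonstrated on a throwaway bare repository (on the server you'd do this inside your real repository directory - the /tmp path is just for illustration):

```shell
# Throwaway bare repository for demonstration purposes.
git init -q --bare /tmp/hook-demo.git
cd /tmp/hook-demo.git

# Enable the post-update hook, then generate the info files that
# dumb-HTTP clients need in order to clone.
mv hooks/post-update.sample hooks/post-update
git update-server-info
```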
You should now have a working HTTP Git server. You can run, for example, "git clone https://git.concealed.world/website", and get my "website" repository. Full paths are not needed - the repository is taken from the root of the web server. No one has write access, nor can they SSH into your server without the correct authentication.
How is anyone going to see or find this, though?
That's where stagit comes in, a web front-end for Git. Clone and compile it on your server.
Repository pages are created by entering the repository directory and running "stagit ./". A listing of repositories can be created by passing every repository to stagit-index at once: "stagit-index dir1/ dir2/ dir3/ > index.html". To get your web front-end to reflect the most recent changes, you'll need to run stagit again.
Stagit formats the pages with 4 files:
"url" contains the url to be displayed as the "git clone" link, the rest are fairly self-explanatory.
You can set these to be different values for every repository directory - but you likely want them to be consistent. Rather than maintain many copies of them, you can simply link them into each repository with "ln -s ../style.css", etc.
That's way too much to maintain for a single 'git push'
You are absolutely right.
Which is why I have a script that does all of the above for me automatically (did you guess?). Heavy use of "find", an underappreciated GNU program.
It checks for running as the root user, and moves into the correct directory (further explanation later).
Then it finds every subdirectory of the current directory. All of them are passed to "stagit-index", and the output goes to "index.html" (git.concealed.world's index file).
Next, it enters each directory one by one, and performs various checks, to see if files exist:
- If style.css doesn't exist, make a symbolic link
- If favicon.png doesn't exist, likewise
- If logo.png doesn't exist, likewise
- If url doesn't exist, output the correct directory to url
- If git-daemon-export-ok doesn't exist, create it
- If post-update is not enabled, enable it and update server info
- Lastly, run stagit
The above assumes all repositories are desired to be public.
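My actual script isn't reproduced here, but a rough reconstruction of the steps above might look like this - wrapped in a function so the path is easy to swap, using globs where the original leans on find, and with the web root and clone URL being my guesses:

```shell
#!/bin/sh
# Rough reconstruction of the publishing script described above. The web
# root, clone URL, and use of globs (rather than find) are my assumptions.
publish_repos() {
    cd "$1" || return 1

    # Rebuild the front page listing every repository.
    stagit-index */ > index.html

    for repo in */; do
        (
        cd "$repo" || exit 1
        # Link in the shared stagit assets if they're missing.
        [ -e style.css ]   || ln -sf ../style.css style.css
        [ -e favicon.png ] || ln -sf ../favicon.png favicon.png
        [ -e logo.png ]    || ln -sf ../logo.png logo.png
        # Write out the clone URL that stagit displays.
        [ -e url ] || echo "https://git.concealed.world/${repo%/}" > url
        # Mark the repository as exportable.
        [ -e git-daemon-export-ok ] || touch git-daemon-export-ok
        # Enable the post-update hook and refresh the dumb-HTTP info files.
        if [ -e hooks/post-update.sample ] && [ ! -e hooks/post-update ]; then
            mv hooks/post-update.sample hooks/post-update
            git update-server-info
        fi
        # Regenerate this repository's pages.
        stagit ./
        )
    done
}

# The real script would also check for root and run against the web root, e.g.:
# [ "$(id -u)" -eq 0 ] && publish_repos /usr/share/nginx/git
```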
You can use hooks to detect 'git push', or just run the above script periodically using cron. Because scripts use the directory they run in, I set the "cd" at the start and end for convenience. I'm using cron as my server isn't very high-maintenance.
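A crontab entry for that might look like this - the script path is hypothetical, and every fifteen minutes is just an arbitrary interval:

```
# m   h  dom mon dow  command
*/15  *  *   *   *    /usr/local/bin/update-git-pages
```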
That should be about it. Following this, you have a web server serving Git over HTTP/HTTPS which people can readily clone using a web interface, but only you can SSH in and push changes. So, anyway. Enjoy.
I thought before I throw up any more of my hacked together scripts across the front page of this site, I should talk a bit more about scripting in general - referring to shell scripting on GNU/Linux systems, in particular.
Why would anyone care?
At its simplest, scripting unlocks far more potential from the tools you have at your disposal, letting you create larger meta-tools from them without writing everything out all over again. As almost everything on *nix is a text file, and most things can be achieved by editing text and text streams, it's fantastic for creating incredibly specific, precise tools suited to exactly what you need, while taking out the manual element. There is no need to write anything twice.
For example, if you...
- ...want to parse input from HTML pages on the web, sort it, and use the output to display on your screen automatically.
- ...want to name files in a long list of subdirectories according to criteria based on their file size and last modification.
- ...want to take a CSV list of e-mails, and send death threats to all of them in a few minutes.
I can think of ways in which all of the above can be relatively easily achieved. These are just very specific examples, to illustrate that the key here is to achieve very specific things with the minimal amount of effort.
I also think it's a pretty strong selling point for the GNU toolkit and those like it on Unix-like systems, although to be fair such things may be provided in similar depth in other operating systems - I've not checked. I'd love for you to prove me wrong.
With this post, I aim to give a reasonably comprehensive shorthand to scripting on Unix-like systems, as a reference point to anyone looking to script something they have a rough idea for. It's not at all complete, but should give a good start for something you can read through and finish relatively quickly. It gives you an idea of what options are available, and where a good starting point to learn would be.
Some helpful concepts
stdin - Input from the terminal/console, or input piped from another program.
stdout - The result output from the program, either to the display (shown as text on screen), or piped into the next program.
variables - Variables can be used in the same way as any other language, to refer to a value. Typically this is altered over time. Below shows the syntax of a variable being saved, and output to the screen:
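A minimal example of each half:

```shell
greeting="Hello world"   # save a value to a variable (no spaces around '=')
echo "$greeting"         # output it to the screen
```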
piping - As above, this takes the output of one program and feeds it in as the input of another. This is done with the "|" character. See:
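For instance:

```shell
echo "Hello world" | awk '{print $2}'
# → world
```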
This will print the second word of "Hello world", as echo pipes its output ("Hello world") to awk as awk's input, and awk outputs the second word of its input to the screen - awk is the last program in the chain.
Other characters to chain together programs:
- ; - wait until the program ends (whether successfully or unsuccessfully), then proceed onto the next program. E.g.:
echo "Line 1"; echo "Line 2"
- & - execute the next program simultaneously.
echo "Both of these programs" & echo "are run simultaneously, whichever ends first will display first."
- && - wait until the program ends, and execute the next - only if the prior program ends successfully, or statement is True
- || - wait until the program ends, and execute the next - only if the prior program fails, or statement is False
The above can be used to create long chains of programs that have planned behaviour on failure. For example:
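The original example isn't preserved here, but a deterministic stand-in might be:

```shell
# grep -q fails (no match), so the && branch is skipped and || runs.
echo "hello" | grep -q "goodbye" && echo "found it" || echo "no match - falling back"
# prints "no match - falling back"
```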
Beyond a certain point of complexity, you may prefer to use fully fledged if-elif-else statements. For example:
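The original example isn't preserved here either, but a stand-in using the same "-z" test discussed below might be:

```shell
name=""
if [ -z "$name" ]; then
    echo "No name given"
elif [ "$name" = "nixx" ]; then
    echo "Hello, me"
else
    echo "Hello, $name"
fi
# prints "No name given"
```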
fi ends if statements, just as done ends loops and esac ends case statements. Case statements and for/while/until loops exist here much as they do in other languages.
However, the above can also be shortened, using the "||" from earlier:
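Something along these lines:

```shell
name=""
[ -z "$name" ] && echo "No name given" || echo "Hello, $name"
# prints "No name given"
```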
You can also see my use of "-z". "-z" here means empty, and likewise "-n" means non-empty. "-lt" means less than, "-ge" means greater than or equal to, etc. Running "man test" on your Unix-like system will give you a longer list of some of the other conditional expressions available to you, which I won't go into any further detail here. Depending on what shell you are using (if you don't know, likely bash) you can run "man bash", which may also give you even more depth.
And now, the actual tools
This intends to be a list of useful programs to understand, or at least be aware of, on a GNU system. Awareness of the tools at your disposal lends itself to coming up with more ideas of how to use them in conjunction with each other. I suggest these ones, because in any script I write they tend to make up 90% of anything I happen to be using.
- echo / printf
For all intents and purposes, these programs work fairly similarly, although you'll notice some key differences the more you use them. "echo" returns the input you give it - whether as strings (in quotes or otherwise) or as variables with a value. "printf" also works with strings and variables, but it does not add any formatting of its own, such as the trailing newline echo appends to its output.
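To make the newline difference concrete:

```shell
echo "one"        # echo appends a newline itself
printf "two\n"    # printf only prints exactly what you give it
```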
"cat" is technically used to concatenate two or more files. It can also print the contents of a single file to stdout - which can go to the screen or another program, which it often sees use for.
"grep" can perform regex pattern matching on its input, and return matches. This also applies to returning the lines of a file which have a match. It can also check for multiple matches, OR/AND logic matching, and reverse matching.
- head / tail
"head" and "tail" can print out the first or last few lines of a file to stdout, using the format "head -n 3 file" - for example, to get the first 3 lines of a file.
"cut" can cut input based on a given delimiter, and print a field after cutting. For example, to output the third value of a comma separated list: "cut -d , -f 3 list.csv"
"awk" is a far more powerful version of cut, often preferred because it can use multiple characters in a string as its delimiter, print multiple fields and lines algorithmically, even performing conditionals and calculations. It's basically a scripting language in its own right, and worth investigating further.
"curl" - "connect URL" is useful as it can print the entirety of a webpage to stdout. This can be used to check for recent changes in a webpage, or with the "-o" or "-O" options cache it for use in a script - as you probably don't want to perform a connection multiple times throughout your script to the same page.
"sed" - "stream editor". sed is very useful for finding and deleting, or finding and altering, matches and lines in a file. It tends to complexity (although not as much as awk) and is worth investigating further for a variety of edge cases. You may be familiar with its format if you've used the Vim editor much.
There are many more programs you could use in your shell script - anything you can run in a terminal likely counts (it's part of why anyone sensible will tell you terminal programs are in almost all cases more powerful, elitism factor included), although the above will likely account for a large bulk of it.
Some honourable mentions
cron - cron jobs are used to automate the running of programs, and I find myself using them often. Their format allows you to run scripts based on minutes and hours of the day, days of the week, months of the year, or at any specific minute in time possibly weeks or months away. By creating a script, you now have a very specific, complex program - which you can run at very specific times of your choosing. The amount of power this offers should be pretty plain to see. cron can even be set up to mail you if an error occurs - very useful if an important job on your server doesn't go through.
mutt - mutt is a terminal-based e-mail client, with an ncurses TUI interface for browsing mailboxes (and offers a lot in terms of writing macros, which is great). It's important, because after the rigorous set-up, it also allows you to send mail directly from the terminal without needing to open the client. You still want to automate those death threats, right?
If you have some dumb manual task you're spending half an hour clicking through, where a robot could do it, go script it now. Save some hours of your life.
Getting good with shell scripting, as with anything else, generally arises from practice and necessity. Once you've made something yourself, this post will hopefully seem more coherent. Hope you enjoyed whatever this was.