Website – A collection of things that don’t really matter
https://www.riptor.com – Mostly useless content from the Pacific Northwest

Updated – https://www.riptor.com/2020/11/15/updated/ – Sun, 15 Nov 2020

I haven’t been doing a good job of housekeeping my website. I recently discovered I was running a very old version of Ubuntu, so it was time to dust off some Linux skills and get crackin’ at porting my site to a new image.

One of the most tedious things about setting up a new site is making sure you have everything covered. I don’t have a script to automagically port one site to another. Instead, I’m going to dig through my bash history and capture everything I just did here.

On the new machine:

$ sudo mkdir /proj
$ sudo mount /dev/xvdf /proj # I have a virtual disk 
$ sudo adduser user
$ sudo usermod -aG admin user
$ sudo usermod -aG sudo user
$ sudo apt upgrade
$ sudo apt-get install apache2
$ sudo apt-get install php libapache2-mod-php php-cli php-mysql php-gd php-imagick php-tidy php-xmlrpc
$ sudo apt-get install mysql-server mysql-client
$ sudo systemctl start mysql
$ sudo mysql -u root 
> create database nameofwpdb;
> create user 'wpuser'@'%' identified by 'password';
> create user 'wpuser'@'localhost' identified by 'password';
> grant all on nameofwpdb.* to 'wpuser'@'%' with grant option;
> grant all on nameofwpdb.* to 'wpuser'@'localhost' with grant option;
> flush privileges;
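
Before leaving MySQL, a quick sanity check that the new account can actually see the database (same placeholder names as above):

$ mysql -u wpuser -p -e 'show databases;'

nameofwpdb should show up in the list.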

On the old machine:

$ sudo tar czf ~/tmp/www.tgz /www
$ sudo tar czf ~/tmp/etc.tgz /etc
$ sudo tar czf ~/tmp/home.tgz /home
$ mysqldump -u root -p nameofwpdb > ~/tmp/nameofwpdb.dump
$ scp ~/tmp/www.tgz user@newhost:/home/user/
$ scp ~/tmp/etc.tgz user@newhost:/home/user/
$ scp ~/tmp/home.tgz user@newhost:/home/user/
$ scp ~/tmp/nameofwpdb.dump user@newhost:/home/user/

And back on the new machine:

$ mysql -u wpuser -p nameofwpdb < ~/nameofwpdb.dump
$ mkdir ~/unpack ; cd ~/unpack
$ tar zxf ../www.tgz
$ tar zxf ../etc.tgz
$ tar zxf ../home.tgz
$ # put the home directory back in order
$ mv home/user/.bashrc ~/ ; mv home/user/.profile ~/
$ mv home/user/* ~/
$ sudo mkdir /www
$ sudo chown www-data:www-data /www
$ sudo mv www/* /www
$ sudo chgrp -R www-data /www/
$ cd /etc/apache2/mods-enabled
$ sudo ln -s ../mods-available/rewrite.load .
$ sudo ln -s ../mods-available/socache_shmcb.load .
$ sudo ln -s ../mods-available/ssl.conf .
$ sudo ln -s ../mods-available/ssl.load .
$ cd ../sites-available
$ sudo cp ~/unpack/etc/apache2/sites-available/* .
$ cd ../sites-enabled
$ # I did some symlinking of my sites here (see the sketch just after this command list)
$ cd ..
$ sudo cp ~/unpack/etc/apache2/IPList.conf . # this is my ip block list for my sites
$ sudo mv ~/unpack/etc/letsencrypt/ /etc
$ sudo chown -R root:root /etc/letsencrypt
$ sudo apt-get install certbot
$ sudo hostnamectl set-hostname --static pokemonname
$ sudo vi /etc/cloud/cloud.cfg # set preserve_hostname: true
$ sudo reboot
$ sudo apt-get install python3-certbot-apache
$ sudo certbot renew --dry-run
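
The symlinking step above is specific to my site names, but as a rough sketch (example.com is a stand-in), enabling a site and sanity-checking Apache looks like this:

$ cd /etc/apache2/sites-enabled
$ sudo ln -s ../sites-available/example.com.conf .
$ sudo apachectl configtest
$ sudo systemctl reload apache2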

And lastly, a few links I used along the way:

https://aws.amazon.com/premiumsupport/knowledge-center/linux-static-hostname-rhel7-centos7/

https://dev.mysql.com/doc/refman/8.0/en/creating-accounts.html

Going Secure – https://www.riptor.com/2018/03/27/going-secure/ – Wed, 28 Mar 2018

I set up an SSL-enabled Apache server on Lightsail today in 15 minutes. I would have spent less time on it if I had gone directly to Let’s Encrypt, but I’ll share the steps in case anyone wants a quick-and-easy SSL-enabled web server. I’m using Ubuntu 16.04.3 LTS so… your mileage may vary.

  1. Add the certbot repo:

    sudo add-apt-repository ppa:certbot/certbot

  2. Update:

    sudo apt-get update

  3. Install the certbot:

    sudo apt-get install python-certbot-apache

  4. Modify your site conf for SSL:

    sudo vi /etc/apache2/sites-enabled/000.example.conf

    ..
    <IfModule mod_ssl.c>
    <VirtualHost *:443>
    ServerAdmin webmaster@example.com
    ServerName example.com
    ServerAlias www.example.com
    ..

  5. Run the certbot:

    sudo certbot --apache -d example.com -d www.example.com

After that you agree to some terms, answer some questions (my shortcut for this is A, N, Enter, 2), and BOOM! You have an SSL-enabled web server too! You might not even need steps 2 and 4 but I did ’em anyway and it didn’t hurt.
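
If you want to confirm everything took, a couple of quick checks (example.com is the same placeholder as above):

    sudo certbot certificates
    echo | openssl s_client -connect example.com:443 -servername example.com 2>/dev/null | openssl x509 -noout -dates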

Up and running with Lightsail – https://www.riptor.com/2018/02/04/up-and-running-with-lightsail/ – Mon, 05 Feb 2018

It happened yet again… I went back to the billing console in AWS and realized my reserved instance was no longer reserved… my bill was double. Yay.

So I decided to check out Lightsail. It’s an AWS service that makes running virtual private servers in the cloud easy. I opted for a relatively small instance to run my website and now I’m only paying $10 a month. I went with an Ubuntu image and it was easy to move all my bits from my previous reserved instance over to Lightsail. Now I won’t have to worry about my monthly bill surprising me again!

Slow progress – https://www.riptor.com/2017/11/28/slow-progress/ – Tue, 28 Nov 2017

I finally started to look at Node.js. I’ve never truly enjoyed JavaScript but the new spec makes the language feel pretty modern and there are plenty of editors out there that make editing a breeze.

I decided to stand up a simple server with a NoSQL solution behind it. This was very simple. I used tutorials from the Node.js site as well as a few from w3schools. Before I knew it I was sending my metric data from my dungeon platform game to a server that captured the metrics and shoved ’em in a collection.
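
Just to illustrate the shape of it (the endpoint, port, and fields here are made up, not my actual schema), the game client effectively does the equivalent of:

curl -X POST http://localhost:3000/metrics \
     -H 'Content-Type: application/json' \
     -d '{"event":"player_death","level":3,"elapsed":42.7}'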

I still need to do something with the data, but I’m making progress again. The next step is to handle some auth (or a simple challenge for the client/server) and some input validation to ensure what I’m reading is meant for the service. I also haven’t quite worked out how to get the data back out of Mongo and make it useful, but there’s always something else to work on.

Cleaning up the blog even more – https://www.riptor.com/2016/03/01/cleaning-up-the-blog-even-more/ – Tue, 01 Mar 2016

I’ve been paying more attention to my log files lately. What I’ve found is not surprising, just disheartening. There are a number of requests by bots and script hackers to specific pages that have been regularly exploited in the past.

Aside from keeping my platform up-to-date (this includes Apache, Linux, WordPress, and so on), I’ve been restricting access to these potentially unsafe resources, and I’ve finally put together a script to automate denying compromised hosts access to the site.

The first important bit (again, besides keeping my platform up-to-date) is restricting access to xmlrpc. I did that with a simple .htaccess script:

# BEGIN protect xmlrpc.php
<files xmlrpc.php>
    order allow,deny
    deny from all
</files>
# END protect xmlrpc.php
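
One caveat: the Order/Deny directives above are the old Apache 2.2 syntax (on 2.4 they only keep working through mod_access_compat). The 2.4-native equivalent would be something like:

# BEGIN protect xmlrpc.php
<Files xmlrpc.php>
    Require all denied
</Files>
# END protect xmlrpc.php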

The next thing I added (found with a little help from Google and Stack Overflow) is an IP list restriction. I just put this in every virtual host config (there’s a sample of the IPList.conf format after the snippet):

<VirtualHost>
    ..
    <Directory /www/>
         Options Indexes FollowSymLinks MultiViews
         AllowOverride All
         Order allow,deny
         allow from all
         <RequireAll>
              Require all granted
              Include /path/to/IPList.conf
         </RequireAll>
    </Directory>

</VirtualHost>
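
For reference, IPList.conf is nothing more than a list of Require not ip lines (it’s what the script at the end of this post spits out); a made-up example, where single addresses and CIDR ranges both work:

Require not ip 192.0.2.10
Require not ip 198.51.100.77
Require not ip 203.0.113.0/24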

(I’ve also just realized how limited pasting code into WordPress can be… but luckily I found this awesome nugget http://hilite.me/)

And the last bit is a script that will search my log files and tell me who is being naughty. It doesn’t update the exclusion list in place and it doesn’t restart Apache – we’ll leave that as an exercise for the reader 😉 (though there’s a rough sketch of one way to do it after the script).

#!/usr/bin/perl
#
# Scan recent Apache access logs for IPs hammering sensitive URLs and
# print an updated block list to /tmp/IPList.conf.

$threshold = 250;   # flag an IP once it exceeds this many requests
$days = 7;          # only look at logs touched in the last N days

%iplist = ();

# Load the IPs already in the block list (the IP is the last field of
# each "Require not ip x.x.x.x" line).
open(INF, "/path/to/IPList.conf" );
while ($a = <INF>) {
    chop($a);
    @fields = split(' ', $a);
    $ip = $fields[ @fields-1 ];

    $iplist{ $ip } = 1;
}
close(INF);

# URL fragments worth watching, and the access logs to search.
@URLS = ( 'wp-login', 'xmlrpc' );
@files = `/usr/bin/find /path/to/apache2/ -name '*access*log*' -mtime -$days`;
$filelist = join(' ', @files);
$filelist =~ s/\n//g;
print "searching $filelist\n";

foreach $urlfragment ( @URLS ) {

    # Count requests per client IP for this fragment and keep the top 30 offenders.
    @list = `/bin/zgrep -h $urlfragment $filelist | /usr/bin/awk '{print \$1}' | /usr/bin/sort | /usr/bin/uniq -c | /usr/bin/sort -rg | /usr/bin/head -n30`;

    foreach $ipdata ( @list ) {
        @fields = split(/\s/, $ipdata);
        $ip = $fields[ @fields - 1 ];
        $count = $fields[ @fields - 2 ];

        if ( $count > $threshold ) {
            if ( $iplist{ $ip } != 1 ) {
                # new ip to add to the list
                $iplist{ $ip } = 2;
                print "adding $ip for requesting $urlfragment $count times over $days days.\n";
            } else {
                print "skipping $ip, already in our list.\n";
            }
        }
    }
}

# Write the merged list in the format Apache expects inside <RequireAll>.
open(OUF, ">/tmp/IPList.conf");
foreach $ip ( sort keys %iplist ) {
    if ( length($ip) > 3 ) {
        print OUF "Require not ip $ip\n";
    }
}
close(OUF);
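
And since I hinted at it above, actually putting the new list into place is just a copy, a config check, and a graceful restart (the path is the same placeholder used in the script):

$ sudo cp /tmp/IPList.conf /path/to/IPList.conf
$ sudo apachectl configtest
$ sudo apachectl graceful
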
Way too much traffic – https://www.riptor.com/2016/02/24/way-too-much-traffic/ – Wed, 24 Feb 2016

I noticed the site was getting a lot of traffic and couldn’t figure out what was causing my sites to eventually become unresponsive. Since I have some free time on my hands, I figured it would be a good time to look into it.

cat /var/log/apache2/*.access.log | cut -d'"' -f2,3 | awk '{print $4" "$2}' | sort | uniq -c | sort -rg | head

This turned up A LOT of requests against xmlrpc.php. This is just my dinky little site, but it turns out there are a lot of script kids out there who like to use xmlrpc.php to do their bidding. So I shut it off completely with an .htaccess directive.

And wouldn’t you know it… my apache2 processes haven’t spun up past 12 since I did it. I’ll get some updated photos up here soon too – I’d love to share some pics of the kids (and the fur kid) and maybe keep an actual running blog for a bit.

So, first entry in over a year – complete!
