This world has been connected aka “this site is live again”

As you can guess, this site is back. I had a couple of hiccups with SSL because certbot's renewal wasn't working correctly. I didn't sweat the issues, but it did take me a while to figure out why the site wasn't loading. Turns out I had a WordPress plugin that enforced HTTPS, which made my nginx config redirect everything to HTTPS.
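For anyone chasing something similar, a couple of quick checks along these lines would have saved me some time (the hostname here is just an example):

```bash
# Show only the response headers; a 301/302 with "Location: https://..."
# means something (a plugin or the nginx config) is still forcing HTTPS.
curl -I http://thehumble.ninja/

# Ask certbot to simulate a renewal; this surfaces the actual renewal error
# without touching the live certificates.
sudo certbot renew --dry-run
```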

I like HTTPS. But seeing that it has become a bit of a hassle to maintain, and I only blog on a monthly basis, I don't see why I should bother with SSL anymore.

I ended up mass-updating a lot of stuff in the database to clean out any HTTPS references to my site. It also turns out that storage.thehumble.ninja has been offline for two months now. There's a quick fix coming for that.
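If you're doing the same cleanup, the idea sketched with WP-CLI looks roughly like this (assuming WP-CLI is installed; the hostname is a placeholder for your own):

```bash
# Preview what would change before committing to it.
wp search-replace 'https://thehumble.ninja' 'http://thehumble.ninja' --all-tables --dry-run

# Take a backup, then run the replacement for real.
wp db export backup-before-replace.sql
wp search-replace 'https://thehumble.ninja' 'http://thehumble.ninja' --all-tables
```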

Site updates: Done, and done.

I finally finished moving everything off Microsoft Azure. Using Azure made me realize that, as much as I wanted to use it, it was just a huge money sink for what I was going to use it for. For days I kept pondering whether or not I should stay with Azure. It didn't sit well with me to keep paying additional fees for bandwidth, disk performance (reads, writes, premium vs. standard), and other such details.

I hope that in the near future Microsoft Azure offers a B-series virtual machine with the ability to reserve bandwidth capacity. It's a much-needed feature for customers that run small or medium sites. I know that most of Azure is managed, as in, if I open a ticket the standard support is supposed to do the work and investigate what's going on, and I know it's not profitable to assign that many resources to supporting small and medium customers when you want to keep response times low for enterprises.

I hope that in the following days I have the time to write a long-winded post about the cloud and current prices. In fact, I'm hoping to talk about Linode, Digital Ocean, Scaleway, and the other services where I've spent my time doing setups.

Now, having said all that: I ended up at ArubaCloud. I thought long and hard about it, and gathered that many people haven't had problems with them. I'm actually excited, because not only did I get a low cost out of it, I can now create genuinely affordable virtual machines based on current needs. Do I need an e-mail server? Let me spin up a VM. Do I need an additional SQL server? Let me spin up a VM and see if I can even out the current load.

I ended up creating a setup I really liked. For a long time I wanted to have the SQL server separated from NGINX/Apache, and with ArubaCloud that was made possible, so now I have a dedicated SQL server serving this site and an HTTP server (nginx) serving all dynamic/static data. I loved working with UFW, setting up the firewall, fail2ban, etc. If I had to put the steps in order, it would look like this (a condensed sketch of the commands follows the list):

  • Spin up a VM in ArubaCloud with Ubuntu.
  • Notice that it doesn't come with the latest Ubuntu, but that's okay: from Ubuntu Xenial I can jump to 18.04.
  • Run do-release-upgrade -d and it will guide you through the process.
  • Once upgraded, which shouldn't take you more than 30 minutes, apply security settings to sshd_config and add the rules needed to protect the VM with UFW, a tool that simplifies firewall management.
  • Install fail2ban, change the SSH port, and so on.
  • Configure server roles (DB and HTTP server, in my case).
  • Install Let's Encrypt's amazing certbot.
  • Generate certificates for your site and be sure to enable SSL on your virtual host.
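Condensed into commands, the middle of that list looks something like the sketch below. Treat it as a rough outline rather than my exact setup: the SSH port, allowed services, and package names are illustrative and vary by role and release.

```bash
# Upgrade Xenial to 18.04 (-d was needed before the 18.04.1 point release).
sudo do-release-upgrade -d

# SSH hardening: edit /etc/ssh/sshd_config by hand, e.g.
#   PermitRootLogin no
#   PasswordAuthentication no
#   Port 2222            # example non-default port
sudo systemctl restart ssh

# Firewall: deny inbound by default, open only what the role needs.
sudo ufw default deny incoming
sudo ufw allow 2222/tcp    # SSH on the custom port
sudo ufw allow 80/tcp      # HTTP (web server role)
sudo ufw allow 443/tcp     # HTTPS (web server role)
sudo ufw enable

# Brute-force protection.
sudo apt install fail2ban

# Let's Encrypt certificates for the nginx virtual host
# (the plugin package name differs between Ubuntu releases).
sudo apt install certbot python3-certbot-nginx
sudo certbot --nginx -d example.com
```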

And the steps go on and on. It looks tedious, and sometimes it is, but I enjoy setting up my environments. And after all the configuration was done?

I had a few hiccups with the MySQL server. I wasn't getting decent response times; I think it was a network issue, because as I'm writing this the response times have improved greatly.
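For what it's worth, the basic checks for that kind of problem amount to something like this, run from the web server (the private IP and user are placeholders):

```bash
# Round-trip time from the web server to the database host.
ping -c 5 10.0.0.5

# Confirm MySQL itself answers, and how long a trivial query takes.
mysqladmin -h 10.0.0.5 -u wpuser -p ping
time mysql -h 10.0.0.5 -u wpuser -p -e 'SELECT 1;'
```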

There are still a few security enhancements left to do, but they aren't exactly priorities. I feel incredibly accomplished with my little journey of configuring my first remote MySQL server and making it work with the HTTP server. At first sight it isn't hard, but once you start considering security, things become a bit harder.
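Since the security part is what makes it tricky, here's the rough shape of locking a remote MySQL server down to a single web server. Every address, name, and path below is a placeholder rather than my actual configuration:

```bash
# Only the web server's address may reach the MySQL port.
sudo ufw allow from 203.0.113.10 to any port 3306 proto tcp

# Bind MySQL to the private interface instead of all interfaces.
# In /etc/mysql/mysql.conf.d/mysqld.cnf:
#   bind-address = 10.0.0.5
sudo systemctl restart mysql

# Inside MySQL, create a user that can only connect from the web server:
#   CREATE USER 'wpuser'@'203.0.113.10' IDENTIFIED BY 'a-strong-password';
#   GRANT ALL PRIVILEGES ON wordpress.* TO 'wpuser'@'203.0.113.10';
#   FLUSH PRIVILEGES;
```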


TheHumble.ninja now with SSL

I've made some changes to thehumble.ninja, as I plan to restructure the site and start purging content here and there. There's an article I've been wanting to write about saving costs with Azure, the pros and cons of using cloud services, and how scary it can get if you decide to use cloud services like Azure, Google Cloud, etc.

With that in mind, I'd like to emphasize that I do like Microsoft Azure and would love to use it without the constant fear of overage charges, but that's another subject that won't be discussed here. Now, going back to Azure: I've read two articles from Scott Hanselman where he demonstrates how to use Azure to deploy cheap containers.

In an ideal world, I would have supported every word he said. Using containers is amazing, wonderful, and just plain awesome. It gives you that control of isolating services (mysql, httpd, mail, etc.) into separate containers, and you can cram many, MANY applications into your App Service plan.

But I can't simply support it, and it's honestly for a very silly yet incredibly harmful reason that I can't agree that Azure is cheap. It's harmful for anyone who wants to run a personal project or site. If you have disposable cash and have never in your life had to budget for a single thing, then Azure is for you.

Bandwidth is my biggest concern. Not just with Azure, but with any cloud service. I think Azure VMs are decently priced and competitive; heck, I even thought of paying for a reserved instance myself for this site (well, the many sites hosted on it). As of today, 1 TB of bandwidth is $88.65 USD. If that's not expensive for you then, sir, by all means go for Azure; I won't stop you. But an average joe with an average job like me who just wants to write and deploy personal projects to the web? 88 bucks is too much, plus everything else billed on top of it.

My suggestion to the Azure team? Include bandwidth packages in App Service plans and offer reserved instances for containers/App Service plans, and I'll be more than happy to subscribe for years to come (as long as the prices are reasonable). And it doesn't have to be exactly 1 TB. I think we all need the kind of safety net most service providers offer with a VPS, and we don't have that in Azure.

Why? Imagine a scenario in which an individual is targeted. If the person gets DDoSed, Azure has basic protection and I'm sure it can withstand the attack, more so if you have Cloudflare in front keeping a good chunk of malicious traffic out. But, hey, say the malicious individual finds out that you're using cloud services to host your site and decides to download a 100-megabyte zip archive you offer a million times. That's 100 TB of bandwidth down the drain, which at the per-terabyte price quoted above works out to thousands of dollars, and I doubt Azure will throw in the towel and say "it's okay, we understand you were targeted and attacked, so we will waive the bandwidth charges".

And maybe my example is exaggerated, but my point stands: even if you aren't attacked and you just have a medium-sized site with 1 TB of bandwidth usage, I highly doubt anyone would pay $88.65 when Digital Ocean, OVH, and even Amazon with Lightsail give you that bandwidth cap at a lower monthly price. I get it, they're overselling bandwidth. Any service provider will probably monitor your VM and try to assess whether it's being abused or that's just the server's normal bandwidth usage. If it's normal? Great, carry on, there's no abuse involved. Most service providers won't care in the long run because they have so many customers who use maybe 3-4 GB of bandwidth and are never expected to reach 400 GB, since it's just a bunch of personal sites and so on. Now, if all their customers used exactly 1 TB, I suppose they'd be running at a deficit. I honestly don't know much about the deals involved with data centers and network usage, and there are people far better specialized in this sort of thing than me.

In conclusion, because I never meant to write a post this long: as you can see, I want to use Azure, but Azure is a big threat to my wallet when it comes to bandwidth. Keep in mind that my full thoughts on Azure will be a longer post than this one; this is just the one issue I really needed to throw out there.

As for the site: it's temporarily hosted on an Azure instance until I decide whether to stay or not. I highly doubt I'll stay, considering the bandwidth concern. I don't use much bandwidth, but I know it's sometimes a good 10 GB, and that doesn't eliminate the concern.


Site updates

A few weeks ago I wrote about completely stopping content creation for thehumble.ninja. I've since realized that I want to keep creating content, be more active in different communities, share what I've discovered, and enjoy the process of content creation.

This site will stop being about my life, programming, and career-related stuff. That content will be moved to a different domain while I repurpose thehumble.ninja to focus on different entertainment mediums: anime, manga, games (keeping Linux gaming alive), and TV series. Part of me already thinks it's a good idea to keep the two things completely separated, and at the same time I wonder if it would lend itself to more of a "reactionary" type of content, where I react to things as they come. I'm sure I'll come up with a way to even out the process.

How often the site gets updated will depend solely on my free time. I'm not getting any younger, and my life priorities have changed completely. I've had to leave communities and other things behind to achieve (usually) a short-term goal or just move on with everything else going on. What I'm trying to say is that while I'm happy doing this, I'd also like to focus on other aspects of my life, like losing weight, traveling to other countries and, well, seeing other things.


Looking for a new home

I recently received an e-mail from Digital Ocean saying that they will be removing the credits soon so that they can repurpose the servers. The decision mostly affects students (like me), so soon enough I will be making a backup of the compressed archives I have published here. As for this blog? I'm not sure where to put it… I certainly don't have the money to spend right now on just one blog.

Repositories

If by any chance you click any GitHub link related to my account, I have to say that I've removed myself from GitHub and now use Bitbucket as my go-to for "backing up" and collaborating. There's also GitLab, but I've found that their interface and site are just so… so slow compared to Bitbucket.

Translator available

I had to remove the Bing translator… not because it's bad, but because the widget sucks so much. If a user uses the widget, on the following visit they get a floating box that hinders the experience; the box pretty much follows you everywhere. I prefer Google's dropdown: you just select a language and it never becomes intrusive.

[Screenshot from 2015-12-21 13:11:59]

Cheers.

The quickest way (at the moment) of importing your Ghost blog to WordPress

To commemorate my sudden switch to WordPress I’ll give you a small hint on how to get your data exported rapidly through RSS.

All you need to do is edit [GHOST FOLDER]/core/server/models/post.js and look for the comment // Set default settings for options. You will see the limit is set to 15; change that to 900000.

Restart the Ghost server, go to http://mysite.com/feed, and save the feed. You will notice all your posts are there.
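If you'd rather script the boring parts, it condenses to something like this. The Ghost path is a placeholder, and the model file still has to be edited by hand, since where exactly the limit lives differs between Ghost versions:

```bash
GHOST_DIR=/var/www/ghost   # placeholder: wherever your Ghost install lives

# 1. Raise the default feed limit from 15 to 900000 near the
#    "// Set default settings for options" comment.
nano "$GHOST_DIR/core/server/models/post.js"

# 2. Restart Ghost however you normally run it (pm2, forever, systemd, ...).

# 3. Save the now-complete feed so WordPress's RSS importer can read it.
curl -o ghost-full-feed.xml http://mysite.com/feed
```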

There's a gotcha: no images will be imported (of course, this is an RSS file after all) and no tags will be imported either (which sounds fishy on the RSS importer's part, maybe a bug). As to why I did it… well, let's say I wasn't satisfied with Ghost, and its dashboard leaves a lot to be desired. I wasn't planning to spend the next hours writing an importer because, honestly, I don't have many posts and the hours spent wouldn't have been worth it.

Domain change.

Like I've mentioned in plenty of posts, the domain change has now taken place. Nothing has changed, as I've made sure that dgzen.pw remains relevant to wget users.

Content remains the same. If a link doesn't work, just shoot me an e-mail. It's on the about me page!