Time Machine Slowdown Issue and Resolution

Problem:

Right after coming back from the holidays, I noticed that my machine became completely unusable whenever Time Machine (TM) ran (on Yosemite). It was so bad that I had to pull the external USB hard disk without ejecting any partitions (while cringing inside) just to regain control of my machine.

My system’s specs: mid 2014 MacBook Pro 15″ with a 2.5 GHz i7 CPU running Yosemite.

Solution:

This solution worked for me but the usual disclaimers apply.

Initially, I ran TM overnight, thinking it had to catch up on the holiday weeks of backups it had missed. This assumption was wrong. When I looked at the 1 TB partition I had made available for TM, it had only 25 GB left. It seemed like TM was thrashing my whole machine while attempting to clean up old backups.

The next thing I did was to shut TM off so that I could use my system while figuring out how to delete old backups from the TM partition. I first tried to delete individual backups, and I found this Stack Exchange article extremely useful: http://apple.stackexchange.com/questions/39287/how-can-i-manually-delete-old-backups-to-free-space-for-time-machine.

I tried both the command-line approach of deleting specific TM backups and the TM interface. The biggest problem was that I couldn’t tell which backups were extremely large (running ‘du’ on the directories was useless due to permission issues and long response times). Additionally, the TM interface blocked me from using my system for anything besides Time Machine (the command line was much better). I then decided to delete all the backups from the command line and got tons of error -36 messages. So this didn’t work well.
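For reference, the command-line deletion goes through tmutil (which is what that Stack Exchange thread suggests) – a sketch, where the volume, machine, and snapshot names are made up:

 # list the backups on the currently mounted TM volume
 tmutil listbackups
 # delete one specific backup snapshot (path is illustrative)
 sudo tmutil delete /Volumes/TM-Backup/Backups.backupdb/MyMacBook/2014-12-20-103402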

My solution was the nuclear option – i.e. nuke the TM partition and start over:

  1. Shut off TM via System Preferences
  2. Disconnect the TM partition via the Finder
  3. Use Disk Utility to erase the partition (see the command-line sketch after this list):
    • name it with the current year so it has a different name than the original
    • Disk Utility will complain with an error while it removes the encrypted partition that TM created
    • re-erase the partition after the initial error so the actual erasure occurs
  4. In the TM pane of System Preferences:
    • remove the old disk (you can’t do this until the partition is gone)
    • create a new TM disk by selecting the new TM partition
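For the command-line inclined, the erase in step 3 can also be done with diskutil – a sketch, where the new volume name and the device identifier are made up (check diskutil list for yours):

 # find the TM partition's identifier (e.g. disk2s2 – yours will differ)
 diskutil list
 # erase it as journaled HFS+ under a new name
 diskutil eraseVolume JHFS+ TM-2015 /dev/disk2s2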

This is probably old hat to experienced users of the Mac but it was new to me.

If you found this useful – let me know via @eli4d on Twitter

Book Review: Neptune Crossing (The Chaos Chronicles Book 1) by Jeffrey A. Carver

Note:  This post contains affiliate links to Amazon.

Spoiler-free review of Neptune Crossing (The Chaos Chronicles Book 1) by Jeffrey A. Carver.

Review

Rating:

  • Harlequin level: n/a
  • Plot/action/story: 5
  • Solid conclusion: 5
  • SciFi thrill: 4
  • Fantasy thrill: n/a
  • Part of a series but doesn’t skimp (as applicable to this book): 5

Overall thoughts about the book

I’ve decided that for this year, I will endeavor to do quick reviews that are spoiler free.

If there’s one thing that allowed me to read a ton of books (as in 10-15 books) last year, it was my purchase of the Kindle Voyage.  The reading quality and compactness of this device have been amazing.

Before I talk about Neptune Crossing I should mention a couple of things.  First of all, I found out about it through BookBub.  I used to get many of the free books that BookBub suggested.  However, after reading a few duds, I’ve become more careful about my choices.  These days I look at the reviews (especially the negative ones) to see if it’s worth reading.  It certainly has become more difficult to find good books from new authors (just like app selection on the App Store).

My first introduction to Jeffrey A. Carver was Panglor, which I got through BookBub (Neptune Crossing came through the same route).  I really tried to read Panglor, but the main character was so exhaustingly trite and without any redeeming qualities that I gave up on the book fairly quickly.  If I need whining, I need look no further than real-life humans.  Why would I spend delicious reading time on whining?

I was in between books in terms of the Kindle Owners’ Lending Library (once per month you can borrow a book) when I decided to read Neptune Crossing, since it was in my Kindle library.  Thankfully, Neptune Crossing was nothing like Panglor.  I should also say that I had no background on the author when I read both books (not that this really matters…if a story is bad, then it’s bad regardless of the author’s fame and other books).

It was slow going initially (the first 70 pages or so) and I didn’t like John Bandicut, the main character.  But John was sufficiently ‘real’ to see me through the first part.  The other thing that bugged me about the first part is that the initial alien gets rewritten after those first 70 pages.  In the afterword, Carver mentions that he changed point of view when writing the book and had to rewrite the whole beginning.  This may be why the beginning was disappointing.

The other thing is that Carver seems to be obsessed with the phrase “hooked his thumb”.  He uses it throughout the book, and it actually took me out of the story because it’s a somewhat unusual phrase for me.  I recognize that it’s a nitpicky thing to mention, but it did make an impact on the book’s readability for me.

So the first third sucked a little bit.  But the last two thirds of the book took off just like the spaceship described in that part of the book.  I couldn’t put down the last part of the book, even if the last few pages took a weird “2001: A Space Odyssey” (http://amzn.to/1RvFr9O) turn where everything became odd and a setup for the second book.

The writing was very good and very descriptive, with no editorial errors to take you out of the story.  While the main character is not fully likable, he is ‘real’, which is what redeems him.  Truthfully, I didn’t really like John Bandicut through the whole book, but he plays the role of the reluctant hero well.  He is a regular guy with regular abilities.  Heck, he’s a regular guy with some short-circuited regular abilities.

The story ended up being pretty good, with a solid mixture of scifi technology, chaos theory, and lots of action.  The last 20 pages were kick-ass and I couldn’t help but finish the book.

Lastly, the book stands on its own regardless of the other books, so kudos on that.  I certainly didn’t feel the need to read any more of The Chaos Chronicles to get a sense of closure and satisfaction from this book.

Some nicely written sentences/phrases from the book (a few among many):

Before he could ask, he felt a sudden sense of memories falling into place like the tumblers of a lock…

The solar system was a vast, cold, dark, and lonely place and he had just set course for himself across its enormous emptiness.

He imagined the planets gathered around, watching and applauding as he smashed straight into the ___, and he wanted to look right for the event.

…but his thoughts were like chunks of ice in a packed floe, vibrating with energy, but too jammed together to move.

Fullstack Radio Podcast Episode with DHH – shaping your technical patterns based on your organizational patterns

On the Fullstack Radio Podcast this week there is a great technical/design discussion with DHH about technical versus organizational patterns, Basecamp 3, and Ruby on Rails 5. Sadly, there was not enough cowbell in the form of curse words (only around 5 🙂 ). I kid about this, but one of the great things about DHH is his opinionated and eminently pragmatic approach. He justifies his positions really well and stands his ground regardless of the sh*t storms that swirl around him.

Beyond all the technical choices and decisions for Basecamp 3, the discussion that caught my attention was the one about technical patterns versus organizational patterns (starting around 09:19 and ending around 18:45). Most outlets of technical information (whether high-profile developers, companies, etc.) focus on architectural patterns, and there’s rarely any talk about organizational patterns. In other words: does the architectural pattern that you choose fit your organizational pattern?

DHH discusses the intersection between organizational patterns and technical patterns. For a small team like Basecamp’s (12 developers/designers), a micro-services architecture would be disastrous in terms of implementation and maintenance, whereas micro-services might be a perfect fit for an organization like AWS. In the case of Basecamp 3, the organizational pattern (i.e. a very small dev team) drove the following architectural choices:

  • hybrid native apps (i.e. do as much on the server as possible with fast web views, while doing native-side optimizations for high-fidelity features)
  • Basecamp 3 as a “majestic monolith” rather than a constellation of micro-services (11:08)

The point is that you have to fit your technical pattern to your organizational pattern, not the other way around. The fundamental question is: “does this technical pattern fit our organizational shape?”

Best quote of the episode: “whatever Facebook is doing do the complete opposite of that and in many cases you’re closer to finding patterns to your organizational shape if you’re a company of 5, 10, or under 50” (13:50). Basically, trying to clone architectural patterns of companies with unlimited resources is a very bad approach.

Micro-services make complete sense for someone like Amazon (15:05). Amazon has lots of people and lots of business units, and it was an early adopter of service-oriented architecture. Team sizes at Amazon are what they are, but the teams need to collaborate, so micro-services fit that model.

The majestic monolith has wrongly been discarded because (second best quote) “people have been looking at giants for inspiration for ants” (17:42).

This is a very interesting approach. I had never thought about it this way, having always looked at the technical patterns. I’ve been at start-ups where the software architect was so focused on and in love with technical patterns that s/he lost all perspective on anything else. In fact, I don’t recall any start-up where the organizational pattern shaped the decisions about the architectural pattern.

For RoR enthusiasts, there is plenty of Basecamp and RoR 5 information beyond the section discussed above.

More DHH info can be found here:

Notes on installing an FTP server on a Digital Ocean virtual machine running Ubuntu 14

Overview

These are some quick notes/lessons related to installing vsftpd on Ubuntu 14.04. My reason for creating such a server was that I wanted to collect photos for an event from the guests who attended. I had originally thought this was going to be easy with my 1 TB Dropbox account. What I didn’t realize was that in order for anyone to upload to a shared Dropbox folder, that person has to have a Dropbox account.

So rather than hassle people about creating a Dropbox account, I figured that a temporary FTP server through Digital Ocean* would be easier. While I deployed the server and got it working for my needs, I later realized that I was trading the ‘you need to create a Dropbox account’ hassle for the ‘you need to upload using an FTP program’ hassle. This was a bad approach too, since I was dealing with users who had a wide (wide) range of technical comfort and knowledge.

* Note that my Digital Ocean links in this post are referral links – they’re a great service which I really like and I definitely recommend.

Creating a virtual machine on Digital Ocean

Creating a virtual machine (i.e. a ‘Droplet’ per Digital Ocean’s jargon) on Digital Ocean (DO) literally takes 55 seconds (which is pretty amazing). DO’s support center (https://cloud.digitalocean.com/support/suggestions) walks you through clear instructions on doing this.

I went with Ubuntu 14.04 because it is an LTS version and was likely to be quite stable. Of course I didn’t need long-term support for such a short-lived server, but I figured the stability would be worth it.
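As an aside, droplet creation can also be scripted against DO’s v2 API instead of the web UI – a sketch, where the droplet name, region, and size are made-up choices and $DO_TOKEN is your API token:

 # create a 512 MB Ubuntu 14.04 droplet via the DO v2 API
 curl -X POST "https://api.digitalocean.com/v2/droplets" \
   -H "Authorization: Bearer $DO_TOKEN" \
   -H "Content-Type: application/json" \
   -d '{"name":"ftp-droplet","region":"sfo1","size":"512mb","image":"ubuntu-14-04-x64"}'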


SSH Keys

DO will email your root password or you can create SSH keys and put the public one on your instance for easy log-in.

I used the instructions at https://www.digitalocean.com/community/tutorials/how-to-set-up-ssh-keys--2 to associate an SSH key with my droplet. This line from the instructions did not work for me:

 cat ~/.ssh/id_rsa_digitalocean.pub | ssh user@123.45.56.78 "mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys"

So I ended up destroying and re-creating my droplet, pre-associating the public key that I had just generated. Since droplet creation is really fast, there was no big downside to doing it this way. Of course, I could have used scp to copy the public key too if I didn’t want to re-create the droplet.
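For anyone hitting the same problem, ssh-copy-id (or a plain scp plus append) should accomplish the same thing as that one-liner – a sketch using the same placeholder key name and IP as above:

 # copy the public key with ssh-copy-id
 ssh-copy-id -i ~/.ssh/id_rsa_digitalocean.pub root@123.45.56.78
 # or copy it over manually and append it
 scp ~/.ssh/id_rsa_digitalocean.pub root@123.45.56.78:/tmp/
 ssh root@123.45.56.78 "mkdir -p ~/.ssh && cat /tmp/id_rsa_digitalocean.pub >> ~/.ssh/authorized_keys"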

Installing and configuring the FTP server (vsftpd)

Installing vsftpd

I found pretty good instructions on https://www.digitalocean.com/community/tutorials/how-to-set-up-vsftpd-on-ubuntu-12-04 for the initial installation.

The key is to install vsftpd and configure /etc/vsftpd.conf:

 apt-get install vsftpd

When looking at vsftpd’s configuration, Vim drove me a bit batty with its built-in syntax highlighting (tons of dark shades of unreadable color) and I had to turn that off. The instructions at http://vim.wikia.com/wiki/How_to_turn_off_all_colors explain how to do this (just put these at the end of the .vimrc):

 syntax off
 set nohlsearch
 set t_Co=0

550 error

My initial run of vsftpd per the tutorial I found yielded a 550 error. This was one of a cavalcade of errors I hit while testing different vsftpd configurations. The long and short of it is that the FTP server can be configured in many different ways (anonymous download only, download and upload, etc.). Each of these possibilities yields a different permutation of options in /etc/vsftpd.conf and potentially other supporting files (for example, virtual users need more configuration files).

My configuration goal was a single user that could upload files to his home directory. This user would be shared among the different people who attended the above-mentioned event. My assumption was that each person would put their photos in a sub-directory that I created for them (see the “Conclusion” section of this post for why this was a poor assumption).

So…I needed a chrooted ‘regular’ user for this configuration. The relevant /etc/vsftpd.conf options are sketched below; some useful sources of information are linked throughout the rest of this post.
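A minimal sketch of a chrooted local-user setup (not necessarily my exact final file – see man vsftpd.conf for each option):

 # /etc/vsftpd.conf – options relevant to a chrooted, upload-capable local user
 listen=YES
 anonymous_enable=NO       # no anonymous access
 local_enable=YES          # allow local Unix users (like ftpuser below) to log in
 write_enable=YES          # allow uploads
 local_umask=022
 chroot_local_user=YES     # jail each local user inside their home directory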


Creating the ftp user – 1

I created the user using:

 useradd -m ftpuser   # -m creates /home/ftpuser, which the chroot setup below relies on

How-To Geek has a good article about useradd. Ubuntu also has an adduser command. Both accomplish the same thing, but I found useradd easier to use.

After creating ftpuser, I decided to give it a brilliantly simple password: ftpuser.

 passwd ftpuser

My intent was to make it easy on my users. This was a fatal (and dumb) security mistake. I am well versed in the stupidity of security by obscurity, and I still fell for it, thinking ‘no one is going to find the IP of this droplet’. I cover this lesson in the “Conclusion” section of this post.

Creating the ftp user – 2

One initial issue with my user and vsftpd was this error:

 500 OOPS: vsftpd: refusing to run with writable root inside chroot()

The problem was that ftpuser’s home directory didn’t have the proper permissions for chroot to work correctly. Basically, the home directory of ftpuser cannot be writeable, but its sub-directories need to be writeable. So I did the following:

 # As ftpuser, within ftpuser's home directory:
 ftpuser@myawesomedroplet:~$ chmod 755 ../ftpuser/    # make home writeable so the sub-directory can be created
 ftpuser@myawesomedroplet:~$ mkdir _test
 ftpuser@myawesomedroplet:~$ chmod 555 ../ftpuser/    # lock home back down, as chroot requires
 ftpuser@myawesomedroplet:~$ touch test               # should now fail with permission denied
 touch: cannot touch ‘test’: Permission denied
 ftpuser@myawesomedroplet:~$ exit
 # As root:
 root@myawesomedroplet:/home/ftpuser# service vsftpd restart

The _test directory is where I would have my logged-in users put their photos (well, something named better than _test).

For more info on this see: http://askubuntu.com/questions/239239/500-oops-vsftpd-refusing-to-run-with-writable-root-inside-chroot-keep-user-j

Some insecurities

Everything looks good but…

After the above configuration of both vsftpd.conf and my local user, I was all set. I tested logging in via an FTP client, changing to _test, and uploading a file, and it all worked swimmingly. Then the next day I tested the exact same thing and couldn’t log into the ftpuser account. I changed the password back to ‘ftpuser’, and within 24 hours the exact same thing happened.
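Incidentally, the upload test itself can be done from the command line with curl – a sketch, with a made-up file name and the usual placeholder IP:

 # upload a file into the _test sub-directory over FTP
 curl -T photo.jpg ftp://ftpuser:ftpuser@123.45.56.78/_test/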

Well maybe it’s a security patch thing

I thought that perhaps my system wasn’t sufficiently patched (the magical thinking trap kicking in). So I went ahead and patched it. I also used the script from https://www.digitalocean.com/community/questions/updating-new-ubuntu-droplet-security-updates to make it easier on myself.

Chris Fidao, in his Servers for Hackers, has an even better approach for automating security patches using cron and the tools of the specific Ubuntu distribution.
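On Ubuntu, the stock route to automatic security patches is the unattended-upgrades package – a minimal sketch (this is not the script from the link above):

 # install and enable Ubuntu's unattended security updates
 apt-get install unattended-upgrades
 dpkg-reconfigure -plow unattended-upgrades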

Nope it’s not a patch thing

My ftpuser kept becoming inaccessible within hours of each password change back to my brilliant password of ftpuser. So I decided to ask my question on askubuntu.com:

http://askubuntu.com/questions/691375/on-ubuntu-14-04-3-something-is-changing-regular-users-password-within-24-hour


Vincent Flesouras rocks!

I got a fantastic answer from a gentleman named Vincent Flesouras.

The short answer: security by obscurity doesn’t work. I feel like Bart Simpson at the blackboard, repeating this sentence over and over again.


Next action

The next step would be to throw away my Digital Ocean droplet and re-create it with something like Ansible. Since Digital Ocean charges based on the existence of an instance (whether it’s online or shut off, it still costs money), this would also save me some money and give me a repeatable virtual machine.

I stopped here because I realized that the FTP server approach was the wrong one for my audience. I think a better solution would be a webserver approach for easy upload of files (perhaps Caddy with some Golang goodness), but this will have to wait for another time because I’m out of time.

Conclusion

I learned the following lessons:

  1. Before diving into something, make absolutely sure you know how your least technical user will use your product/creation/monstrosity. I had assumed that FTP clients were pervasively built into all web browsers. This is true, but in the wrong direction for my use case: most web browsers can connect to an anonymous FTP server to download files, not upload them. Of course there are plenty of web-based FTP clients, but then I’d be giving a third party access to this FTP server with personal items (i.e. photos) on it. So…an FTP server was the wrong solution for this problem to begin with.
  2. Never, ever use a super-simplistic password while relying on the obscurity of your server (i.e. ‘just’ an IP without an associated domain). Rationally, I knew this to be the case, but there’s nothing like your brand-spanking-new user account on a brand-spanking-new virtual machine changing its password ‘by itself’ every day.
  3. I should have used Ansible or some such orchestration software to create the server. It would have allowed me to quickly and cleanly destroy/create server instances, which would have helped with testing my server’s configuration (security and otherwise).

This was definitely a learning experience both about vsftpd and security.

QP: Shout-out to High Sierra and the best customer service I’ve experienced in a very long time

So before you ask – no this is not a sponsored ad or any such thing. This is about a great customer service experience which is somewhat unusual these days.

I just received my brand spanking new High Sierra backpack after contacting them (https://shop.highsierra.com/contact-us) about their lifetime warranty. The long and short of it is that I’ve had this High Sierra backpack for many years and I’ve used it every day. Recently it ripped and I decided to shop around for a new backpack. I came across some Amazon reviews (and responses from High Sierra) that talked about their lifetime warranty. So I decided to give the warranty a shot.

High Sierra was amazing. I contacted them, they asked for some pictures of the backpack’s identifying tags, and bam – a new backpack was on the way. In all honesty, I haven’t experienced this sort of customer service (for a physical good) in years. I would have completely understood if they had rejected my request (did I mention that I’ve used this backpack every day?). But they didn’t. They sent me a new backpack and cemented me as a loyal customer.

Thanks High Sierra!

QP: Clarify – best screenshot program is available for 50% off

I have used Clarify (http://www.clarify-it.com/) on an almost daily basis. I’ve tried lots of different screenshot programs, and Clarify is really the best. If you’re looking for a rapid way to document anything with words and annotated images, then Clarify is your program. It exports to many different formats, including Markdown. All of my tutorial posts that have images are done in Clarify. I can’t say enough good things about it.

QP: More Vue.js goodness on the Changelog podcast including a great discussion about the nature of open source and personal projects

Yet another great discussion about Vue.js on the Changelog podcast (http://5by5.tv/changelog/184). Whereas the Fullstack Radio interview (https://eli4d.com/?s=Fullstack+vue.js) focuses on where to begin with Vue.js, the Changelog’s interview covers Vue’s origins and how it compares with other frameworks.

One additional aspect that this episode touches on is the nature of open source and the ‘line’ between a personal project and an enterprise-worthy one. This discussion begins around 55:40 into the podcast. For anyone trying to persuade their company/management about the merits of using an open source project (whether Vue.js or otherwise), this discussion is for you. I especially like Jerod’s (https://changelog.com/author/jerodsanto/) perceptive comments about assumptions of support and laziness when it comes to open source.