Introducing The eli4d Gazette

Hello Friend,

Welcome to Issue 1 of the eli4d Gazette. This will be my way to keep in touch with former students and new friends I have made through eli4d.com. My intent is to keep it short and sweet, picking a few interesting things, both tech and non-tech. It will be delivered every two weeks, though that may change to weekly depending on how it goes.

I value your time and attention, and I hope you find this worth both. If you are interested, you can subscribe at the following URL: https://tinyletter.com/eli4d


The eli4d Gazette

Issue 001: March 16, 2016

Tech Pick (JavaScript related)

Brendan Eich, creator of JavaScript, discusses his view of JavaScript’s direction at the Fluent Conference (click the ‘x’ to get past the “you need to log in” screen). His message is the same as in other years – “don’t bet against JavaScript” – but this year he added WebAssembly to the betting phrase. Apparently Ash from Evil Dead is his spirit animal (so the programming approach of jumping into JS and ‘hacking’ is built into the language’s DNA :-O ). Is Brendan right about JavaScript’s future? Who knows? He’s a smart guy, but JavaScript is out of his and anyone else’s direct control. The battle lines are certainly being drawn in the mobile space between web apps and native apps (so far native has trounced web in terms of performance).

Edu Pick

I tried to provide a comprehensive approach to picking server-side software in my “Using the Boring / Old / Popular (BOP) criteria for server side software evaluation” article. It is geared towards beginners (developers and anyone else who needs to pick server-side technologies), since experienced devs will have a “gut feel” and won’t need such a numerical approach.

Podcast Episode Pick

Eric Molinsky created an amazing episode called “Why They Fight” where he connects superhero battles to D&D character alignments. I know it sounds ridiculously geeky, but once you listen to this episode you will never again look at TV/movie/story heroes and villains the same way. If you’re a writer, the character alignment table may give you a new twist/angle on how you view and build characters in your writing.

How to Use your Amazon Prime No-Rush Credits

Overview

In this post I cover how to use Amazon Prime’s no-rush credits. This applies if you’re an Amazon Prime customer. I’ve gotten burned several times, either because the credit expired or because it didn’t apply to the item I was purchasing. I’m writing this post to remind myself how to do this and for anyone else who has wondered about this credit usage.

And to Amazon support: You’re welcome – feel free to extend my Prime Membership at your convenience 🙂

It begins at the checkout screen

What’s that you say Amazon? Get $1 if I don’t use my Amazon Prime two-day shipping? Sure – why not.


What’s that – get a $1 credit for a purchase of what item?

So what are those details?


The “Details”?

So what are “select eBooks…”? It seems simple but nothing tells you exactly what you can purchase 😦

The "Details"?

So when you choose the no-rush shipping option…

Let’s say that you love David McRaney’s podcast (the cookie-eating segment is the BEST) but you don’t need the book right now. So you choose the no-rush shipping, and initially nothing happens. You don’t get any information about the $1 credit until the book ships (which makes sense if you think about it – why give you the credit before your item actually ships on the no-rush schedule?).

When the book ships, you get the credit notification


Clicking on additional information once again

So here’s another explanation of the credit and what you can purchase.

But what can I purchase Amazon? I want to use that $1 wisely!


So what does this mean?

Q: It sure feels like I can use this on whatever Amazon sells – right?

A: Wrong!

I ended up contacting support regarding this and I got the scoop, skinny, and explanation.

The EXPLANATION with a delicious Hunger Games example

A very nice Amazon support associate called me back when I asked for help through the website. I told the lady (let’s call her Jane) that I purchased an ebook and my dollar credit didn’t kick in. Jane empathized with my frugality-based sadness. She told me that the “credit only applies to items sold by Amazon Digital Services”. I asked her for an example. She told me to look up the “Hunger Games” books. She said that the key is the “Sold by” area. If that says Amazon Digital Services then the credit applies; otherwise you’re out of luck.

She told me that I should start any search with “Amazon Digital Services” and narrow my query parameters from there.

I thanked Jane for her clear explanation and help (marking the feedback email with AWESOME).


Time to search for “Amazon Digital Services”

So the first step is the general search query of “Amazon Digital Services”.


Let’s narrow it down based on department

Choosing “Books” for example from the department drop-down.


Narrowing the department choice further (Books in this case) using the left-side choices

The left-side menu is THE way to narrow the search criteria within a department.

For the Books department I typically use the:

  • Type of book (1)
  • Book format (2)

    These choices are quite useful if you’re a Kindle book hoarder 🙂.


It’s time for some sweet Space Opera Kindle Books

Here’s an example of search narrowing using the left-side choices.


Use the credit right away

To each his own, of course, but given the fairly quick expiration date on no-rush credits, I suggest using the credit right away. Just bookmark your search query with your narrowed criteria and you’re on your way.

Conclusion

So there you go. Maximum use of Amazon’s no-rush credit.

Enjoy!


Please let me know via Twitter (@eli4d) if you found this post useful.

Time Machine Slowdown Issue and Resolution

Problem:

Right after coming back from the holidays, I noticed that my machine was completely unusable whenever Time Machine (TM) ran (on Yosemite). It was so bad that I would need to pull the external USB hard disk without ejecting any partitions (while cringing inside) to regain control of my machine.

My system’s specs: mid 2014 MacBook Pro 15″ with a 2.5 GHz i7 CPU running Yosemite.

Solution:

This solution worked for me but the usual disclaimers apply.

Initially, I ran TM overnight, thinking it had to catch up on the holiday weeks of backups it had missed. This assumption was wrong. When I looked at the 1 TB partition that I had made available for TM, it had only 25 GB left. It seemed like TM was thrashing my whole machine while attempting to clean up old backups.

The next thing I did was shut TM off so that I could use my system while figuring out how to delete old backups from the TM partition. I first tried to delete individual backups, and I found this Stack Exchange article to be extremely useful: http://apple.stackexchange.com/questions/39287/how-can-i-manually-delete-old-backups-to-free-space-for-time-machine.

I tried both the command-line approach of deleting specific TM backups and the TM interface. The biggest problem was that I couldn’t tell which backups were extremely large (running ‘du’ on the directories was useless due to permission issues and a long response time). Additionally, when I used the TM interface, it would block me from using my system for anything besides Time Machine (the command line was much better). I then decided to delete all the backups via the command line, and I got tons of error -36 messages. So this didn’t work well.
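
For reference, the command-line route looks roughly like this (a sketch using OS X’s built-in tmutil; the volume name, machine name, and timestamp are illustrative):

 tmutil listbackups
 # ...prints one path per backup, e.g.:
 # /Volumes/TM-Disk/Backups.backupdb/MyMac/2015-12-01-103012
 sudo tmutil delete /Volumes/TM-Disk/Backups.backupdb/MyMac/2015-12-01-103012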

My solution was the nuclear option – i.e. nuke the TM partition and start over:

  1. Shut off TM via System Preferences
  2. Disconnect the TM partition via the Finder
  3. Use Disk Utility to erase the partition:
    • name it with the current year so its name differs from the original
    • Disk Utility will complain with an error while it removes the encrypted partition that TM created
    • re-erase the partition after the initial error so the actual erasure occurs
  4. In the TM system preference:
    • remove the old disk (you can’t do this until the partition is gone)
    • create a new TM disk by selecting the new TM partition

This is probably old hat to experienced users of the Mac but it was new to me.

If you found this useful – let me know via @eli4d on Twitter

Notes on installing an FTP server on a Digital Ocean virtual machine running Ubuntu 14

Overview

These are some quick notes/lessons related to installing vsftpd on Ubuntu 14. My reason for creating such a server was that I wanted to collect photos for an event (from the guests who attended it). I had originally thought this was going to be easy with my 1 TB Dropbox account. What I didn’t realize was that in order for anyone to upload to a shared Dropbox folder, that person has to have a Dropbox account.

So rather than hassle people about creating a Dropbox account, I figured that a temporary FTP server on Digital Ocean* would be easier. While I deployed the server and got it working for my needs, I later realized that I was trading the ‘you need to create a Dropbox account’ hassle for a ‘you need to upload using an FTP program’ hassle. This was a bad approach too, since I was dealing with users that had a wide (wide) range of technical comfort and knowledge.

* Note that my Digital Ocean links in this post are referral links – they’re a great service which I really like and I definitely recommend.

Creating a virtual machine on Digital Ocean

Creating a virtual machine (i.e. a ‘Droplet’ per Digital Ocean’s jargon) on Digital Ocean (DO) literally takes 55 seconds (which is pretty amazing). DO’s support center (https://cloud.digitalocean.com/support/suggestions) walks you through clear instructions on doing this.

I went with Ubuntu 14.04 because it is an LTS version and was likely to be quite stable. Of course I didn’t need long-term support for such a short-lived server, but I figured the stability would be worth it.


SSH Keys

DO will email your root password or you can create SSH keys and put the public one on your instance for easy log-in.

I used the instructions at https://www.digitalocean.com/community/tutorials/how-to-set-up-ssh-keys–2 to associate an SSH key with my droplet. This line from the instructions did not work for me:

 cat ~/.ssh/id_rsa_digitalocean.pub | ssh user@123.45.56.78 "mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys"

So I ended up destroying and re-creating my droplet, pre-associating the public key that I had just generated. Since droplet creation is really fast, there was no big downside to doing it this way. Of course, I could have used scp to copy the public key too if I didn’t want to re-create the droplet.
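
For what it’s worth, ssh-copy-id is another standard way to push a public key to the droplet (a sketch, assuming the utility is installed on the Mac; the address is the tutorial’s placeholder):

 # installs the named public key into ~/.ssh/authorized_keys on the droplet
 ssh-copy-id -i ~/.ssh/id_rsa_digitalocean.pub root@123.45.56.78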

Installing and configuring the FTP server (vsftpd)

Installing vsftpd

I found pretty good instructions on https://www.digitalocean.com/community/tutorials/how-to-set-up-vsftpd-on-ubuntu-12-04 for the initial installation.

The key is to install vsftpd and configure /etc/vsftpd.conf:

 apt-get install vsftpd

When looking at vsftpd’s configuration, Vim drove me a bit batty with its built-in syntax coloring (tons of dark shades of unreadable color) and I had to turn that off. The instructions at http://vim.wikia.com/wiki/How_to_turn_off_all_colors explain how to do this (just put these at the end of the .vimrc):

 syntax off
 set nohlsearch
 set t_Co=0

550 error

My initial run of vsftpd per the tutorial that I found yielded a 550 error. This was one of a cavalcade of errors when testing different vsftpd configurations. The long and short of it is that the FTP server can be configured in different ways (anonymous download only, download and upload, etc.). Each of these possibilities yields a different permutation of options in /etc/vsftpd.conf and potentially other supporting files (for example, virtual users need more configuration files).

My configuration goal was a single user that could upload files to his home directory. This was going to be a user shared among the different people that attended the above-mentioned event. My assumption was that each would put their photos in a sub-directory that I created for them (see the “Conclusion” section of this post for why this was a poor assumption).

So…I needed a chrooted ‘regular’ user for this configuration. My final /etc/vsftpd.conf boiled down to a handful of options, sketched below.
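
Here’s a minimal sketch of that kind of configuration – the option names are standard vsftpd options, but treat the exact set of values as illustrative rather than a verbatim copy of my file:

 # /etc/vsftpd.conf - single chrooted local user, no anonymous access
 listen=YES                # run standalone instead of via inetd
 anonymous_enable=NO       # disable anonymous FTP
 local_enable=YES          # allow local Unix users to log in
 write_enable=YES          # permit uploads
 local_umask=022           # default permissions for uploaded files
 chroot_local_user=YES     # jail users into their home directories
 xferlog_enable=YES        # log uploads/downloads
 pam_service_name=vsftpd   # authenticate through the vsftpd PAM profile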


Creating the ftp user – 1

I created the user using:

 useradd -m ftpuser   # -m creates the home directory (Ubuntu's useradd doesn't create one by default)

How-To Geek has a good article about useradd. Ubuntu also has an adduser command; both accomplish the same thing, but I found useradd easier to use.

After creating ftpuser, I decided to give it a brilliantly simple password: ftpuser.

 passwd ftpuser

My intent was to make it easy on my users. This was a fatal (and dumb) security mistake. I am well versed in the stupidity of security by obscurity, and I still fell for it, thinking that ‘no one is going to find the IP of this droplet’. I cover this lesson in the “Conclusion” section of this post.

Creating the ftp user – 2

One initial issue with my user and vsftpd was this error:

 500 OOPS: vsftpd: refusing to run with writable root inside chroot()

The problem was that ftpuser’s home directory didn’t have the proper permissions for chroot to work correctly. Basically, the home directory of ftpuser cannot be writable, but its sub-directories need to be writable. So I did the following:

 # As ftpuser, within ftpuser's home directory:
 ftpuser@myawesomedroplet:~$ chmod 755 ../ftpuser/   # make home writable long enough to add a sub-directory
 ftpuser@myawesomedroplet:~$ mkdir _test             # the upload target
 ftpuser@myawesomedroplet:~$ chmod 555 ../ftpuser/   # home itself back to non-writable (chroot requirement)
 ftpuser@myawesomedroplet:~$ touch test              # sanity check: writing to home should now fail
 touch: cannot touch 'test': Permission denied
 ftpuser@myawesomedroplet:~$ exit
 # As root:
 root@myawesomedroplet:/home/ftpuser# service vsftpd restart

The _test directory is where I would have my logged-in users put their photos (well, something named better than _test).

For more info on this see: http://askubuntu.com/questions/239239/500-oops-vsftpd-refusing-to-run-with-writable-root-inside-chroot-keep-user-j

Some insecurities

Everything looks good but…

After the above configuration of both vsftpd.conf and my local user, I was all set. I tested logging in via an FTP client, changing to _test, and uploading a file, and it all worked swimmingly. Then the next day I tested the exact same thing and I couldn’t log into the ftpuser account. I changed the password back to ftpuser, and within 24 hours the exact same thing happened.

Well maybe it’s a security patch thing

I thought that perhaps my system wasn’t sufficiently patched (the magical thinking trap kicking in). So I went ahead and patched it. I also used the script from https://www.digitalocean.com/community/questions/updating-new-ubuntu-droplet-security-updates to make it easier on myself.

Chris Fidao, in his Servers for Hackers, has an even better approach for getting automatic security patches using cron on the specific Ubuntu distribution.

Nope it’s not a patch thing

My ftpuser kept becoming inaccessible a few hours after I changed its password back to my brilliant password of ftpuser. So I decided to ask my question on askubuntu.com:

http://askubuntu.com/questions/691375/on-ubuntu-14-04-3-something-is-changing-regular-users-password-within-24-hour


Vincent Flesouras rocks!

I got a fantastic answer from a gentleman named Vincent Flesouras.

The short answer: security by obscurity doesn’t work. I feel like Bart Simpson at the blackboard, repeating this sentence over and over again.


Next action

The next step would be to throw away my Digital Ocean droplet and re-create it with something like Ansible. Since Digital Ocean charges based on the existence of an instance (whether it’s online or shut off, it still costs), this would also save me some money and give me a repeatable virtual machine.

I stopped here because I realized that the FTP server was the wrong approach for my audience. I think a better solution would be a web server for easy upload of files (perhaps Caddy with some Golang goodness), but that will have to wait for another time because I’m out of time.

Conclusion

I learned the following lessons:

  1. Before diving into something, make absolutely sure you know how your least technical user will use your product/creation/monstrosity. I had assumed that FTP clients are built into all web browsers. That’s true, but in the wrong direction for my use case: most web browsers can connect to an anonymous FTP server to download files, not to upload them. Of course there are plenty of web-based FTP clients, but then I’d be giving a third party access to an FTP server with personal items (i.e. photos) on it. So…an FTP server was the wrong solution for this problem to begin with.
  2. Never, ever, ever use a super-simplistic password relying on the obscurity of your server (i.e. ‘just’ an IP without an associated domain). Rationally, I knew this to be the case, but there’s nothing like your brand-spanking-new user account on a brand-spanking-new virtual machine having its password changed ‘by itself’ every day.
  3. I should have used Ansible or some such orchestration software to create the server. It would have allowed me to quickly and cleanly destroy/create server instances, which would have helped with testing my server’s configuration (security and otherwise).

This was definitely a learning experience both about vsftpd and security.

Posting to WordPress by email

Overview

This is a quick post about posting to WordPress by email. WordPress has excellent instructions on posting by email (via https://en.support.wordpress.com/post-by-email/). This post is more about the limitations of the post-by-email feature and how to use TextExpander to be more efficient.

Why post by email?

My thinking is that I want to be able to post short snippets by email. I got the idea for this approach from Manton Reece. Like Manton, I think that Twitter and Facebook are too ephemeral. So my goal is to post only short, snippet-ish posts by email. I’d like to limit these posts to 200 characters if possible, but I don’t have a good way of universally controlling the size of the post when posting from my other devices (without composing the post in a separate editor).

There’s also the issue of post title – should such short posts have titles? Manton indicates that microblogs should not have a title. At this point I’m not too dogmatic about this so I’m fine with having a title where I distinguish it with a ‘Snip’ at the beginning.

To set up, just follow the instructions

WordPress’s instructions are excellent


When you post via email you get a response from WordPress.com

This was my test post (draft – never published) where I tested Markdown support. It’s a very nice response email telling me that WordPress received my post.


Limitation – no Markdown via email 😦

It is somewhat strange, but posting by email to WordPress does not interpret Markdown. WordPress’s instructions indicate that different email clients handle formatting differently, which is why formatting is limited. Markdown is perfect for posting by email, but it does not work with this approach (currently). I think WordPress would need a new tag such as [markdown on] (with the default being off) and then interpret Markdown the same way the WordPress site editor does.

I’ve sent WordPress support a question about this

I’ve sent WordPress support a question about wanting some Markdown love.


WordPress support response


Using TextExpander

TextExpander is awesome and helps me reduce typing significantly both on the Mac and on iOS. Below is my current TextExpander snippet.

Some notes about the choices of the tagging:

  • I chose to explicitly put the title in the body of the email rather than in the subject line. My thinking is that this is close to the meta settings of static blog systems.
  • The [end] tag is very nice because I can put some optional tags or reminders after it in case I want to change things. I was initially thinking of using a pop-up menu for all the tags on my site, but this seemed like overkill. It was easier to just copy the tag cloud and put it in the optional section. (A sketch of the resulting email template follows this list.)
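
Here’s a hypothetical sketch of the kind of email body such a snippet might expand to. The [tags], [status], [delay], and [end] shortcodes are documented WordPress post-by-email tags; the [title] line is my assumption for putting the title in the body (if it isn’t honored, the subject line carries the title):

 [title Snip: a short title goes here]
 [tags snippets]
 [status publish]
 [delay +2 hours]

 The short (ideally under-200-character) body of the post goes here.

 [end]
 Optional reminder area – everything after [end] is ignored, so candidate
 tags copied from my tag cloud can live here.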

Version 0.1 of my TextExpander snippet – the popup design choice

This was a good first try, but the TextExpander custom keyboard on iOS doesn’t work well with TextExpander snippet popups. In order to use popups I would need something like Drafts. Drafts is a great app, but I don’t want to complicate my toolchain.


Version 0.5 of my TextExpander snippet – the flat design choice

In my experiments I found out that the normal publicize settings from the WordPress web editor (Twitter in my case) do not kick in without a tag push. I also added a ‘Snip’ in front of my title, and I went with a publish status offset by a 2-hour delay, just in case I completely mess up a snippet. My original tags were also messed up because I used a gigantic phrase for one tag.

This non-popup design allows me to use the TextExpander snippet in my iPhone’s Mail app, save it as a draft, and pick it up in my Mac’s Mail app.


Conclusion

It is extremely easy to set up posting by email for those using WordPress.com.

Pros:

  • Post from anywhere you have email access
  • Fairly good controls on post (status, date, etc..)
  • Use of TextExpander makes posting consistent, reliable, and easy

Cons:

  • No Markdown support (hopefully one day…come on Matt, make it so)
  • For microblogging – controlling the character count is not possible (it would be great to let WordPress.com provide feedback via a [count 200] type of tag)

One thing that I didn’t experiment with is using anchor tags to create links within my microblog post. WordPress’s docs indicate that their system will interpret HTML as follows:

As much formatting as possible will be retained, although the Post by Email system will strip unnecessary HTML tags so that your email is displayed correctly. Note that you will need to use an email client that supports rich text or HTML formatting in order to make use of this feature. Most website based clients (Hotmail, Gmail) do support this, as do most desktop clients (Outlook, Mail). You may need to switch your client into rich text or formatted mode.

HTML and email clients are still a big mess in 2015 😦.

One last last thing – auto-posting to Twitter and Facebook. At some point I’d like to auto-post the actual microblog post to Twitter and Facebook rather than a link to it via WordPress’s publicize settings. There’s probably a neat Node.js or Go solution. It’s another task on my never-ending OmniFocus list.

How to create a static content server on Amazon S3

Overview

In this tutorial I quickly go over creating a static site using S3. This should be a simple process, and for the most part it is, except for Amazon’s security policy editor. There are many ways to control security in AWS, and I beat my head against a wall for many hours trying to figure out what would work. I present what worked for me, but this may not be the ‘best’ way to do security for an S3 bucket. If I get more info on how to do it better, I will update this post accordingly.

Assumptions:

  • You’ve created an AWS account on http://aws.amazon.com (it’s super-easy)
  • My static domain (static.eli4d.com) will use WordPress.com’s nameservers. I host this blog (aside from images and static content) on wordpress.com. The $13 is well worth my time, and my content is portable due to the static server usage.

Note: Originally I created an images.eli4d.com S3 bucket, but I am now switching to static.eli4d.com. While creating the images bucket I accumulated lots of scattered notes, so if there are any references to the images bucket, that initial effort is why.

Get an AWS account

Creating an AWS account is extremely easy and it’s faster than Amazon Prime.


How to create an S3 bucket for static.eli4d.com

Pick S3

The sheer breadth of Amazon’s web services is astounding…and it keeps growing.


The creation step is very simple – just click that “Create Bucket”

The only gotcha is that your bucket name should be the exact name of the domain you want to associate it with. So if I want static.eli4d.com for my static content, then I need a bucket named static.eli4d.com. If that bucket name is taken (bucket names are universal across all of AWS), then you’re out of luck and have to go down a more complicated route (see https://eli4d.com/2015/09/02/amazon-web-services-lesson-s3-bucket-names-are-universal-so-get-your-domain-named-s3-bucket-before-someone-else-does/ ).


S3 Management Console


It’s ALIVE

The Franken-URL is awake…but inaccessible


Current permissions – main account


Time to create the index.html


Time to create robots.txt


Let’s get back to the bucket


S3 Management Console – uploading files – 1


S3 Management Console – uploading files – 2


Upload details page

Keeping the defaults.


Upload complete


My bucket shows the uploaded files


Testing end point – can I see that index.html

And the answer is no. Not surprising but the answer is still NO.

It’s time to go down the rabbit hole also known as AWS permissions. This is a short trip into that hole. We’ll have a longer trip when enabling an access policy between a user and this bucket.


Allowing anyone to get to the S3 bucket using a browser

Where do I find my S3’s ARN?

Go to the S3 bucket and edit the bucket policy to see the bucket’s ARN. In my case the ARN is arn:aws:s3:::static.eli4d.com/*


Setting bucket permissions – 1

Following http://blog.learningtree.com/configuring-amazon-s3-to-serve-images/ in setting bucket properties


Setting bucket permissions – 2

Keep in mind the following: when you click the link, the AWS Policy Generator launches in a separate browser window. You create the policy there, and then you have to copy the generated policy (a bunch of text) from that browser window back into this one. This is not obvious, and from a UX point of view it can be crazy-making and confusing.


Setting bucket permissions – 3


AWS Policy Generator

The only permission the bucket needs in order to be world-readable is GetObject.


ARN is key

You need to put in the correct ARN:

arn:aws:s3:::static.eli4d.com/* in my case as mentioned above. Mess up the ARN and you will be slightly sad.

‘Principal’ refers to anyone who accesses the bucket (so by putting * we’re saying ‘everyone’).


Once you add the statement

Policy generator gives you a summary before actual generation. It’s time to click the ‘Generate Policy’ button.


Clicking the ‘Generate’ button

Side note: that version date looks odd – it’s the version of the policy language, not today’s date, so you can’t just put the current date there.


So you have a policy and you need to copy it

I know…you’re thinking WTF, and so am I. Copy the policy, then go back to the window from which you launched the policy generator.

A key principle here: do not modify any of this text. Seriously…don’t do it.
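
The policy you’re copying looks roughly like this (a sketch – the Sid and any statement identifiers will be whatever the generator emitted):

 {
   "Version": "2012-10-17",
   "Statement": [
     {
       "Sid": "AllowPublicRead",
       "Effect": "Allow",
       "Principal": "*",
       "Action": "s3:GetObject",
       "Resource": "arn:aws:s3:::static.eli4d.com/*"
     }
   ]
 }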


Here’s where you’re going to copy the text into

Remember that browser window from which you opened the policy generator? Go back to that one.


Now paste in the policy and save it


If everything is ok policy wise you get back to the main window

There’s a really quick green checkbox and here we are (sure wish the UX was better here).


Time to retest the endpoint

Woohoo…now we can get to the S3 bucket.

What’s left:

  • Domain mapping of static.eli4d.com domain to this endpoint
  • Permissions to allow me to sync resources


Domain mapping to the S3 bucket

My eli4d.com domain is controlled by WordPress (my registrar, however, is Hover – I LOVE Hover). These instructions apply to adding the static.eli4d.com subdomain via WordPress. I had tested some other domain configurations and this turned out to be the simplest approach (thumbs up to Hover and WordPress support). Depending on your domain configuration – you’ll have to adjust your steps accordingly when adding a subdomain.

Note: any ‘Hover’ URLs from this post are a referral link to Hover. BTW in case I didn’t mention it – I love^1000 Hover.

To the wordpress.com domain editing url

The not-so-easily found domains link on WordPress.com.


Let’s edit the DNS

Time to add my subdomain of static.eli4d.com


Create a CNAME record for static.eli4d.com

The steps are to (see the example record after this list):

  1. Create the CNAME
  2. Click on the ‘Add’ button
  3. Click on the ‘Save Changes’ button
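
For reference, the finished record maps the subdomain to the bucket’s S3 website endpoint – something like this in zone-file terms (the region suffix in the endpoint is an assumption; use whatever endpoint the S3 console shows for your bucket):

 static  IN  CNAME  static.eli4d.com.s3-website-us-west-1.amazonaws.com.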


Check that static.eli4d.com is showing on the browser

Problem – when I type static.eli4d.com it redirects to eli4d.com – why?

The answer is DNS propagation, which may take 48 to 72 hours.

Let’s pretend that 48 to 72 hours have passed

Ta-da – it works!

Hint: Use Firefox/Chrome private browsing mode to validate the domain, since it eliminates caching issues.


Checking in: workflow – how do I upload resources to my S3 bucket?

Now what? How do I upload my static resources to this S3 bucket? It will most likely be images, but it can be anything else (S3 accepts files as large as 5 TB). I write my blog entries on my Mac in Markdown, putting the static items in the post, but how do I get from here to there, workflow-wise?

I could just log into the AWS console and upload the resources but it feels clunky and not my type of workflow. What I want is something on the command line that syncs my resource directory to my S3 bucket. So here’s my approach:

  • find a command line utility
  • configure a user on AWS that can sync data only to this bucket (this is just basic security; I don’t want my main ‘root’-ish user doing everything from my Mac); ideally I would have a user per bucket, but I’ll stick to one sync user to honor some semblance of simplicity and sanity
  • configure the S3 bucket to accept connection from this user (this turned out to be a bear – AWS’s security model is breathtakingly complex)

Note: If you’re ok with just uploading resources via the AWS console then you’re done…enjoy! (please let me know via Twitter that you found these instructions useful…it encourages me to write more of this)

Finding an S3 sync command line utility

Lots of possible solutions but some outdated


But there’s a promising article

at: http://serverfault.com/questions/73959/using-rsync-with-amazon-s3

An Amazon-native solution would be ideal (just like using the docs straight from the horse’s mouth – i.e. Amazon).


I want sync but…

I need to start at the beginning, so I need to back up to the AWS CLI instructions.


Selecting “User Guide”

Selecting "User Guide"

Nice – the page has what I need

http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-welcome.html


More AWS cli documentation


And more AWS docs

http://docs.aws.amazon.com/AmazonS3/latest/dev/walkthrough1.html


Command line install instructions


I’m using the bundled installer since I don’t have pip but I do have Python 2.7.5


Installing the AWS Command Line Interface


Sweetest command line – here we go

Just follow the instructions


The ‘aws’ command works!

Note that I moved back to my standard account rather than the admin account on the Mac (trying to be secure and all that jazz).


The command to sync a local folder to the AWS bucket

At this point this command doesn’t work yet but it will later. All possible options for aws cli can be found here: http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html

 aws s3 sync /Volumes/elev-per/Dropbox/eli4d-content/s3/static.eli4d.com/2015/ s3://static.eli4d.com/2015 --delete --exclude "*.DS_Store"

Basically, the above command syncs all resources from my local directory, using the local directory as the authoritative source and deleting any mismatches on the S3 bucket side (the --delete flag), while excluding the Mac-side pollution of .DS_Store files – so those don’t sync.

The fantastically awesome Nicholas Zakas and a slightly sad story about S3

I happened to come across a very interesting post by Nicholas Zakas ( http://www.nczonline.net/blog/2015/08/wordpress-jekyll-my-new-blog-setup/ ).

There are 2 very interesting things:

  1. His comment about s3command was very interesting. Since I don’t regenerate all of my static content, awscli is fine for me. But it’s something to keep in mind for static blog generation.
  2. The ability of someone else to indirectly squat on his domain by taking the name as an S3 bucket. I’ve written about this here: https://eli4d.com/2015/09/02/amazon-web-services-lesson-s3-bucket-names-are-universal-so-get-your-domain-named-s3-bucket-before-someone-else-does/


Creating an S3 user for syncing

As mentioned before, I need a user that can sync resources for this specific bucket.

I need some Sam IAM (come on Dr. Seuss – work with me here)

http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-set-up.html#cli-signup%20%28IAM%29


Creating a sync user via IAM – 1

Time to go to that IAM console


Creating a sync user via IAM – 2

Time to click that ‘Users’ link


Creating a sync user via IAM – 3

Select ‘Create New Users’


Creating a sync user via IAM – 4


Creating a sync user via IAM – 5


Creating a sync user via IAM – 6

Here is where you create an access key (I had already created one). The gist is that AWS creates an access key ID and a secret access key, and you need to save the secret because it’s never shown to you again.


Now how do I give this user access to my images bucket?

Duckducking around: https://duckduckgo.com/?q=how+add+IAM+user+to+s3

I found: http://docs.aws.amazon.com/AmazonS3/latest/dev/walkthrough1.html

Click the user to see its permissions


New IAM user information


Configuring aws-cli with my newly created AWS user

Time to configure

http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-set-up.html

Note 1: I found my region by logging into the AWS console > S3 and looking at the top area for the region corresponding to my S3 bucket.

Note 2: All configuration (by default) lives in ~/.aws/
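
The configure step itself is just four prompts (the key values below are placeholder examples from AWS’s docs, not real credentials):

 $ aws configure
 AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
 AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
 Default region name [None]: us-west-1
 Default output format [None]: json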


Calling s3

S3 references:

http://docs.aws.amazon.com/cli/latest/userguide/cli-s3.html

http://docs.aws.amazon.com/cli/latest/reference/s3/index.html

http://docs.aws.amazon.com/cli/latest/reference/s3/ls.html

http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html

Dang – I need an IAM policy for my user.


Configuring my S3 bucket to allow sync from my eli4dsync user

This is what I want


Insert many head scratching hours and attempts to get this going and lots^1000 of expletives

I initially attempted to change the security policy of the S3 bucket to allow my sync user. I got lots and lots of ‘Access Denied’ messages. I scoured AWS documentation, DuckDuckGo, Google, Stack Overflow, and a Lynda course about AWS. Somewhere along the way I figured that maybe I needed to approach this from the other side – the eli4dsync user – and attach a policy to the user granting access to the bucket. This is the approach that worked for me, but it may not be the right approach. If someone at Amazon would clarify how security policy works, I would love to write that up (so, open invitation to AWS people with security policy information to get in touch).

Image credit: https://flic.kr/p/bMGA1T


Applying an inline policy to the IAM user rather than the S3 bucket

Per http://blogs.aws.amazon.com/security/post/Tx3VRSWZ6B3SHAV/Writing-IAM-Policies-How-to-grant-access-to-an-Amazon-S3-bucket

At first glance the article seems to be about the S3 bucket’s policy, but it’s actually about a policy attached to the IAM user.


Testing my sync code against my changes I find that this one works

So there are two parts:

Part (1) applies to the whole bucket. Listing the bucket contents is needed for the recursion that the awscli sync command performs (think subdirectories of files being synced…though S3 doesn’t really have a file hierarchy concept).

Part (2) applies to objects that are within buckets.

With this inline policy my sync user does NOT have carte blanche – it’s the right thing (for my purposes).
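
A sketch of what that inline policy looks like, modeled on the AWS blog post above (the exact set of object actions is my assumption based on what aws s3 sync with --delete needs):

 {
   "Version": "2012-10-17",
   "Statement": [
     {
       "Effect": "Allow",
       "Action": ["s3:ListBucket"],
       "Resource": "arn:aws:s3:::static.eli4d.com"
     },
     {
       "Effect": "Allow",
       "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
       "Resource": "arn:aws:s3:::static.eli4d.com/*"
     }
   ]
 }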


It works!!!

My sync script works and I have a very specific policy for my sync user.


Conclusion and Thanks

That’s it.

As you can tell, the AWS security policy creation is the biggest head-scratcher. The rest is fairly straightforward.

My thanks to the folks who created the resources linked throughout this post and/or answered my questions.

Please let me know via Twitter (https://twitter.com/eli4d) that you found these instructions useful…it encourages me to write more of this.

How to Create an Encrypted Zip file on the Mac

It all started with a text message from my wife about needing help with a password on a zip file:

wife: I tried to put password in zip file

wife: I followed this page http://smallbusiness.chron.com/create-encrypted-zip-files-mac-44338.html

me: command of: zip -e myzipfile.zip file1.txt file2.txt OR zip -r -e myzipfile ./directory – the tricky part is the terminal

wife: I used the terminal but for some reasons, it only creates empty folder
…a bunch of text messages later…

me: Here you go – give them this link…

My wife is really smart and quite a decent user of her Mac. She needed to encrypt a zip file because she was sending some paystubs to some bank loan people.

I know what you’re starting to think: “but encrypting a zip file is insecure”. And you’re probably right, but the fact of the matter is that bank employees are severely constrained by their employers, and you’re LUCKY if they’re even allowed to open an encrypted zip file. It’s not that they’re incapable of such a feat, but rather that they’re at a financial institution with a whole lot of rules and regulations that make secure electronic delivery of anything quite debatable.

There should be an easy, practically trivial way to compress a directory and put a password on it on the Mac, but the “encrypt with a password” part is not easy at all. In this tutorial I’ll walk you through how to do it.

And now for the instructions

These instructions and pictures were done on Yosemite but should apply to future versions of Mac OS X.

Creating a non-encrypted zip file is easy

Let’s use an example directory called my directory


Right click on your folder

Choose the ‘Compress’ selection


And now you have a zip file

It’s easy peasy – you now have a *my directory.zip *file. But it isn’t encrypted with a password. Anyone can double click on it and it would show its contents without difficulty.


Creating an encrypted zip file

To create an encrypted zip file you need to use the Mac’s command line. The command line is a vastly different way to interact with the file system, and most of the tutorials I’ve seen don’t exactly explain how to use it. So I’ll take a stab at clearly explaining how to use the command line to create an encrypted zip file of a directory.

Before we proceed, you need to agree to follow my instructions to the letter… I’m assuming you are nodding your head ‘yes’. If you deviate in any way and get surprising results, then go back and try again, following the exact steps.

The usual disclaimer applies – use at your own risk, and I’m not responsible if you destroy your Mac 🙂

Create a mydirectory directory in the Documents folder of your mac

Make sure that mydirectory doesn’t have any spaces in its name.

Note: Your mydirectory folder doesn’t have to be in Documents. It can be anywhere that you can get to with the Finder.


Place your files in the mydirectory directory


Search for terminal in spotlight

Spotlight is the Mac’s search facility. If you don’t see it, just press the following keys: COMMAND SPACEBAR

The command key is the one with the cloverleaf symbol, and the spacebar is… the spacebar. When you see the search box, type terminal and then double-click on the suggested program, like the one shown in the image (below).


Use the Terminal to change directory to mydirectory

In the terminal we’re going to change directory (cd) into the mydirectory folder. Remember that the Terminal is a completely different way of interacting with your file system (the other way is visually through the Finder).

In the terminal you will be typing the cd command followed by a space (i.e. pressing the spacebar).

  1. In the terminal type: cd
  2. With your mouse, drag the mydirectory folder from the Finder window into the Terminal

In the next step you will press the enter key on your keyboard, and within the Terminal you will have changed directory into mydirectory.


Press the enter key

You should now be in the mydirectory folder. Congratulations…we’re almost there.


Now type cd ..

Type: cd ..

The key thing is typing the letter c and d then a space with the spacebar followed by two periods.


It’s time to zip up the directory with and encrypt it with a password of your choice

Type: zip -re myzipfile.zip ./mydirectory/

There’s a space (i.e. press the spacebar) between the words (see the red lines in picture).

Note that myzipfile.zip is the encrypted zip file that will hold your files. You can use another name for it but make sure that you don’t put spaces in the file name.
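
Putting the whole Terminal session together (a sketch, assuming mydirectory lives in Documents):

 cd ~/Documents                        # the folder that contains mydirectory
 zip -re myzipfile.zip ./mydirectory/  # -r recurses into the folder, -e prompts for a password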


Now press the enter key

You’ll be prompted for the encryption password so enter whatever password that you want to use and then press enter. Then type it in again when you see ‘Verify password:’


What you should see

After you re-enter the password at ‘Verify password:’ and press the enter key, you should see the directory being zipped and encrypted.


Time to test the zip file

You’re pretty much done. It’s a good practice to verify that the directory is encrypted in the zip file. So we will use the Terminal that you already have open to open a Finder window to the directory where the zip file is located.

Type: open .

There’s a space between the open and the period.


Open the new zip file in the Finder window

Use the new Finder window to find the zip file that you just created (myzipfile.zip in this case). Double-click on the file.


When you double click the zip file

If you have properly encrypted the zip file then you should get the picture shown below. If you don’t see this then you probably didn’t encrypt the zip file.

Note: the “hello.txt” message just refers to the first file in the directory. When you put in the correct password the zip file will be fully decrypted, so all files will be decrypted.


After you put the correct password

After you put in the correct password you should see your directory in the Finder window.

Note: In the image below there is a ‘mydirectory 2’. The reason there’s a ‘2’ next to the directory name is that your original directory is in the same location.


Open the directory and verify that you can open the files without issue


You’re done – CONGRATULATIONS!!!

It’s time to put on your party hat and do the happy dance. You’re done!

Image credit: http://emojipedia.org/party-popper/


Let me know how I can make these instructions better.

If you think I can do a better job with this tutorial – let me know via Twitter (@eli4d).

One last thing…

If you think that Apple should improve zip file encryption – send Tim Cook an email at tcook@apple.com (I’m serious). Looking for an idea of what to send? Here’s a sample:

Email Subject:

Can you (pretty) please improve the Mac OS X Archive utility to allow zip encryption from the Finder

Email Body:

Dear Tim,

Can you kindly ask your Mac OS X team to add an encryption capability to the Mac’s Archive utility so I can use the Finder’s built-in ‘Compress’ service rather than tortuous command-line tutorials like https://eli4d.com/2015/09/23/how-to-create-an-encrypted-zip-file-on-the-mac

Sincerely,

ENTER_YOUR_FIRST_NAME