TinyPNG & TinyJPG on the Command Line

tinypng-cli is an amazing tool for compressing an image, or a directory of images, with a single command. It’s beautiful for fixing the few images that normal compressors miss or don’t do a good job on, or the images that slip through the cracks.

A big part of my day is ensuring all my sites have very high Google Page Speed scores, which are heavily influenced by images alone. I sure wish I had known about this tool years sooner; it would have saved me hours of manual work.

Really simple to set up:

  • Get an API key from TinyPNG (free tier: 500-image cap)
  • Have npm installed on your server
  • Install tinypng-cli: npm install -g tinypng-cli
  • Start compressing: tinypng demo.png -k ##### (see the example below)
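For example, with the key in hand, a single file or a whole directory can be run through TinyPNG in one go. These lines are just a sketch — YOUR_API_KEY and assets/images are placeholders — so check the tool’s own help output for the exact options your version supports:

tinypng demo.png -k YOUR_API_KEY
tinypng assets/images -k YOUR_API_KEY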

Prevent iCloud from Syncing /node_modules/ Folder

Update: This was too hard to maintain, and things “got weird” sometimes. I’ve moved to Dropbox instead and am very happy.


I have my localhost server on iCloud. Cloud syncing, in my opinion, is the most reliable way to ensure what you’re working on is backed up to the second.

However, an 18,000+ file folder like /node_modules/ is just ridiculous to have syncing. I have nothing to prove this, but I’m certain having one or more folders like that would at some point negatively affect syncing and/or indexing.

So, as far as I’ve found, the best way to prevent folders like /node_modules/ from syncing is to append .nosync to the folder name, then symbolically link to it.

This gives you a nice little “Ineligible” tag on your huge folder, and your symbolic link sits in its place.

To set it up, in your project folder, just run:

npm install
mv node_modules node_modules.nosync
ln -s node_modules.nosync/ node_modules
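If you want to confirm the link took, ls -l shows node_modules pointing at the real folder (output trimmed):

ls -l node_modules
lrwxr-xr-x ... node_modules -> node_modules.nosync/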

Server Stats on Desktop

Knowing how my server is performing at all times is kind of a hobby of mine. I know I can subscribe to tools and services that display this info in beautiful graphics and charts, but I like simple things, and I like using default stuff. Years ago I wrote about a similar script that used this info with jQuery gauges and Panic StatusBoard, but this is a much cleaner and simpler method, and I just like it a whole heck of a lot better. So to do this, we have three steps:

  1. A cronjob on the remote server that puts top output into a non-public server_top.txt file
  2. A public PHP file on the remote server, server_top.php, that regexes the server_top.txt values and creates a serialized array of our desired stats for public consumption
  3. Finally, a local PHP file, geeklet-server_top.php, that converts the serialized array into text bars and puts them on our desktop with GeekTool

First, on the remote server, we need to make sure top outputs all CPUs so we can factor them all in and find an average. To do that, run top and hit the 1 key; this will reveal all CPUs. Next, hit W to save this CPU-revealed configuration as the default.
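Once that’s saved, a batch run of top on a typical procps system produces lines shaped roughly like these (the numbers here are made up purely for illustration); the per-core %CpuN lines are what we’ll average later:

top - 14:23:01 up 12 days,  3:42,  1 user,  load average: 0.15, 0.10, 0.05
%Cpu0  :  1.3 us,  0.7 sy,  0.0 ni, 97.3 id,  0.7 wa,  0.0 hi,  0.0 si,  0.0 st
%Cpu1  :  2.0 us,  0.3 sy,  0.0 ni, 97.0 id,  0.7 wa,  0.0 hi,  0.0 si,  0.0 st
KiB Mem :  2048000 total,   512000 free,   819200 used,   716800 buff/cache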

We then set up a cronjob with crontab -e that puts the top result in a file in a non-public directory.
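A line along these lines does the trick; the output path is a placeholder for wherever your non-public directory lives, and the every-minute schedule is just a starting point. top -b -n 1 runs a single batch-mode pass using the saved configuration:

* * * * * top -b -n 1 > /home/you/private/server_top.txt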

Second, with our server outputting top into a non-public file, we don’t want to share all of that info, so we’ll be cryptic with this next script. We’ll make a file in the public directory called server_top.php that reads the .txt file and only outputs a vague array of non-compromising data, which we’ll interpret locally in the next step.
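Here’s a rough sketch of what that can look like, assuming a Linux procps-style top snapshot; the file path, the regexes, and the array keys are all mine, so adjust them to whatever your own top output and naming look like:

<?php
// server_top.php -- read the cron-generated top snapshot and expose
// a serialized array of vague, non-compromising stats.
$raw = file_get_contents('/home/you/private/server_top.txt'); // non-public path (placeholder)
$stats = array();

// Snapshot time and 1-minute load average from the "top - HH:MM:SS up ... load average:" line
if (preg_match('/^top - (\d{2}:\d{2}:\d{2}).*load average: ([\d.]+)/m', $raw, $m)) {
    $stats['time'] = $m[1];
    $stats['load'] = (float) $m[2];
}

// Average the idle percentage across every %CpuN line, then invert it for "CPU used"
if (preg_match_all('/^%Cpu\d+\s*:.*?([\d.]+) id/m', $raw, $m)) {
    $idle = array_map('floatval', $m[1]);
    $stats['cpu'] = round(100 - array_sum($idle) / count($idle), 1);
}

// Memory used as a percentage, from the "KiB Mem" line
if (preg_match('/^[KMG]iB Mem\s*:\s*([\d.]+)\s+total,\s*[\d.]+\s+free,\s*([\d.]+)\s+used/m', $raw, $m)) {
    $stats['mem'] = round($m[2] / $m[1] * 100, 1);
}

// Disk usage isn't in top, so lean on PHP's own disk functions for the root partition
$total = disk_total_space('/');
$stats['hdd'] = round(($total - disk_free_space('/')) / $total * 100, 1);

header('Content-Type: text/plain');
echo serialize($stats);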

Lastly, with our remote server now giving us an array of the CPU, memory, average load, top time, and HDD usage, we’ll render this data locally into bars for GeekTool.
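Something like this is all it takes; the URL is a placeholder for wherever server_top.php lives, the array keys match the sketch above, and the 20-character bar width and 90% warning threshold are just my preferences:

<?php
// geeklet-server_top.php -- fetch the serialized stats and print monospace bars for GeekTool.
$stats = unserialize(file_get_contents('https://example.com/server_top.php'));

// Draw a simple text bar; past the warning point the fill switches to something louder.
function bar($percent, $width = 20, $warn = 90) {
    $filled = (int) round(min($percent, 100) / 100 * $width);
    $fill   = $percent >= $warn ? '!' : '=';
    return '[' . str_repeat($fill, $filled) . str_repeat('-', $width - $filled) . ']';
}

printf("CPU  %s %5.1f%%\n", bar($stats['cpu']), $stats['cpu']);
printf("MEM  %s %5.1f%%\n", bar($stats['mem']), $stats['mem']);
printf("HDD  %s %5.1f%%\n", bar($stats['hdd']), $stats['hdd']);
printf("Load %.2f   (top ran at %s)\n", $stats['load'], $stats['time']);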

Locally, we now have a script rendering our server stats as bars. We then create a new Shell Geeklet and give it a command to run.
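Assuming the local copy lives at the path below (a placeholder; put it wherever you like), the command is simply PHP invoking the script:

php ~/scripts/geeklet-server_top.php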

Then set the font to something monospace so the text lines up, set it to refresh as often as our crontab does, and that’s all!

You’ll notice there’s a warning point, which simply changes the progress bar to something more attention-grabbing if any of the stats are above a concerning percentage. If static text on the desktop isn’t enough, you could take this script further and use mail() to email you a notification of the high stat.
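As a rough sketch of that idea, a couple of extra lines in the Geeklet script could fire off an email when a stat crosses the line; the address, subject, and 90% threshold are placeholders, and PHP’s mail() needs a working mailer on the machine running it:

// Hypothetical alert tacked onto geeklet-server_top.php
if ($stats['cpu'] >= 90) {
    mail('you@example.com', 'Server alert: CPU is high', 'CPU is at ' . $stats['cpu'] . '%');
}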