I had a project where every single media library image URL needed to be filtered, on both the back end and front end. As far as I can tell, these are the filters for every area:
The myplugin_filter_html_image_urls() function peels images out with a regex in areas where it’s not just the URL being sent. This function may also need to verify the current domain, in case someone has followed the bad practice of embedding external images.
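A minimal sketch of what that function could look like — the regex, the domain check, and the myplugin_filter_image_url() helper are assumptions for illustration, not the original implementation:

```php
<?php
// Hypothetical sketch: pull every <img> src out of a block of HTML,
// run each URL through the single-URL filter, and stitch it back in.
function myplugin_filter_html_image_urls( $html ) {
    return preg_replace_callback(
        '/<img[^>]+src=["\']([^"\']+)["\']/i',
        function ( $matches ) {
            $url = $matches[1];

            // Skip external images (hotlinking is bad practice anyway).
            if ( strpos( $url, home_url() ) !== 0 ) {
                return $matches[0];
            }

            // Reuse the plain-URL filter (assumed to exist elsewhere).
            $filtered = myplugin_filter_image_url( $url );
            return str_replace( $url, $filtered, $matches[0] );
        },
        $html
    );
}
```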
I thought passing variables into WordPress hooks with an anonymous function was impossible, so I always worked around it and rewrote the logic. Today I found out anonymous functions support a use keyword, which allows exactly that:
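For example — the CDN rewrite here is just an illustration, the point is the use ( $cdn_domain ) capture:

```php
<?php
// use() imports $cdn_domain from the enclosing scope into the callback,
// so the hook can see a local variable without globals.
$cdn_domain = 'cdn.example.com'; // hypothetical value

add_filter( 'the_content', function ( $content ) use ( $cdn_domain ) {
    // Rewrite local upload URLs to the captured CDN domain.
    return str_replace(
        'example.com/wp-content',
        $cdn_domain . '/wp-content',
        $content
    );
} );
```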
So simple and easy. It shaves hours off troubleshooting.
tinypng-cli is an amazing tool to compress an image, or a directory of images, with a single command. It’s beautiful for fixing the few images that normal compressors miss or do a poor job on, or the images that slip through the cracks.
A big part of my day is ensuring all my sites have very high Google PageSpeed scores, which are heavily influenced by images alone. I wish I had known about this tool years sooner; it would have saved me hours of manual work.
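A rough sketch of how I’d wrap it for a whole directory — the tinypng invocation and its flags are assumptions from memory, so check the package’s README before relying on them:

```shell
# Hypothetical wrapper: compress every PNG/JPEG under a directory
# with tinypng-cli (requires a free TinyPNG API key).
compress_dir() {
  find "$1" -type f \( -iname '*.png' -o -iname '*.jpg' -o -iname '*.jpeg' \) |
  while read -r img; do
    echo "compressing: $img"
    # tinypng "$img" -k "$TINYPNG_API_KEY"   # uncomment with a real key
  done
}
```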
I explored and tested a bunch of WordPress plugins for S3 – they’re excellent in their own right, but I was bothered by the bloat & weight on PHP for the backup. Don’t get me wrong, I understand it’s a complex undertaking when you’re creating a backup/restore UI, working off wp-cron, creating many features that benefit lots of people, etc. But I didn’t need any of that – I just needed my server backed up without having to deal with anything.
A quick search of GitHub led me to a beautiful shell script that uses wp-cli and awscli to perform the backups. Here’s the kicker: it takes less than 40 lines of code. I modified it a bit for my needs and it works better than I thought possible:
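The original block isn’t reproduced here, but a minimal sketch of the same idea looks like this — the paths, bucket name, and exact wp-cli/awscli arguments are my assumptions, not the script I actually ran:

```shell
#!/bin/sh
# Hedged sketch of a sub-40-line wp-cli + awscli backup.
SITE_DIR="${SITE_DIR:-/var/www/html}"
BUCKET="${BUCKET:-s3://example-backups}"   # hypothetical bucket
STAMP="$(date +%Y-%m-%d)"

backup() {
  # 1. Export the database with wp-cli
  wp db export "/tmp/db-$STAMP.sql" --path="$SITE_DIR"

  # 2. Archive the site directory
  tar -czf "/tmp/site-$STAMP.tar.gz" -C "$SITE_DIR" .

  # 3. Push both artifacts to S3 with awscli
  aws s3 cp "/tmp/db-$STAMP.sql" "$BUCKET/$STAMP/"
  aws s3 cp "/tmp/site-$STAMP.tar.gz" "$BUCKET/$STAMP/"

  # 4. Clean up the local copies
  rm -f "/tmp/db-$STAMP.sql" "/tmp/site-$STAMP.tar.gz"
}
```

Calling backup at the end of the file makes it a drop-in cron script.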
Amazing. Runs with sh backup.sh thrown into crontab. Amazing.
Update: This was too hard to maintain, and things “got weird” sometimes. I’ve moved to Dropbox instead, very happy.
I have my localhost server on iCloud. Cloud syncing, in my opinion, is the most reliable way to ensure what you’re working on is backed up to the second.
However, syncing a folder of 18,000+ files like /node_modules/ is just ridiculous. I have nothing to prove this, but I’m certain having one or more folders like this would at some point negatively affect syncing and/or indexing.
So, as far as I’ve found, the best way to prevent folders like /node_modules/ from syncing is appending .nosync to the folder name, then creating a symbolic link to it.
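Concretely, it’s two commands — shown here against a throwaway directory standing in for a real project folder:

```shell
# Demo of the .nosync trick in a scratch directory.
proj="$(mktemp -d)"            # stand-in for a real project folder
mkdir "$proj/node_modules"

# Rename so iCloud marks the folder ineligible for sync...
mv "$proj/node_modules" "$proj/node_modules.nosync"

# ...then symlink so tooling still resolves node_modules/.
ln -s node_modules.nosync "$proj/node_modules"
```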
This gives you a nice little “Ineligible” tag on your huge folder, while the symbolic link keeps the original path working.