Half-Elf on Tech

Thoughts From a Professional Lesbian

Tag: dreamobjects

  • cPanel ❤️ DreamObjects

This is something I’ve wanted for a long time. I opened a ticket with cPanel about it yonks ago, as S3-compatible storage (like Ceph) is offered by more than just Amazon, and yet cPanel was making it super hard to use for backups.

Well in the next release of cPanel, this will no longer be the case! If you’re on version 74 (which is in the release stage, but not current, so most people don’t have it yet) you can do this.

    Add A New Backup Option

    Go to Home > Backups and open up the settings.

    In there, you can add a new Backup option. Pick S3 Compatible:

    Backing up from cPanel to DreamObjects will soon be a reality.

    Configure for DreamObjects

    Now just throw in the right data:

You’ll want to use objects-us-east-1.dream.io for the endpoint, and then your bucket and keys.
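For reference, the destination settings end up looking something like this (cPanel’s exact field labels vary a bit by version, and the bucket name and keys here are placeholders):

```
Destination Type:  S3 Compatible
Endpoint:          objects-us-east-1.dream.io
Bucket:            my-backup-bucket
Access Key ID:     YOUR_DREAMOBJECTS_KEY
Secret Access Key: YOUR_DREAMOBJECTS_SECRET
```

The keys are the ones generated in your DreamObjects panel, not your DreamHost login.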

    Back it Up

    And with that you’re done. Thank you, cPanel!

  • Remoteless SVG

I spent about half a year working on being able to remote-load my SVGs, and I wanted to for a few reasons:

    1. Having only one place to update my icons is more efficient
    2. They don’t need to be in code versioning control (like GitHub)
    3. Licensing (not all are free)

I solved those problems when I finally figured out SVG injection (via JavaScript) and set up CORS so my scripts could load the files from my object storage.

    It’s not all lollipops and sunshine

However, there was, and is, one really big issue. Because the icons are remote, if the remote server isn’t accessible but my site is, I get no icons. Now, the smart dev in me knows I should have a check: if the server is up, use the icons remotely; otherwise, use a fallback.

There was one big, and in the end insurmountable, problem with that: there was no way to do it without slowing my site down, which was the headache in the first place. If I checked the uptime for every icon on a page, the site went from a one-second load to as much as twenty seconds. Even in the best solution, I added two to five seconds to every page load just checking whether the server was up. Which is kind of the problem with wp_remote_get() in the first place.

    Make ’em local again

    This meant the ‘best’ solution was to make them local. Again. Yaaay.

But I really didn’t want to have them stashed in GitHub. I wanted my ‘base’ to be the cloud storage, and then that needed to sync. Normally I would use rsync for that, but you can’t rsync from Ceph storage.

    Don’t panic. There are answers.

    Boto-Rsync

    If you’re on DreamPress or any DreamHost server, they’ve forked boto-rsync and (if you set up a .boto file) you can do this:

    boto-rsync --endpoint objects-us-west-1.dream.io s3://my-cool-icons/ /home/my_user/mydomain.com/wp-content/uploads/my-cool-icons/
    

    Using cron or some other scheduling tool, setting up a server to run that at regular intervals isn’t too terrible.
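As an example, a crontab entry along these lines would sync the icons down every night (the schedule and paths are just placeholders, and you may need the full path to boto-rsync on your server):

```
# Pull new icons from DreamObjects every night at 3:15 a.m.
15 3 * * * boto-rsync --endpoint objects-us-west-1.dream.io s3://my-cool-icons/ /home/my_user/mydomain.com/wp-content/uploads/my-cool-icons/
```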

    AWS CLI

    There’s also the option of AWS CLI, which is very similar, only it’s actually maintained. The advantage with boto-rsync is that it’s installed on the servers at DreamHost. If you have AWS CLI on your server, you can do this:

    aws s3 --endpoint-url https://objects-us-west-1.dream.io sync s3://my-cool-icons/ /home/my_user/mydomain.com/wp-content/uploads/my-cool-icons/
    

Keep in mind, you need to configure this either with aws configure or a ~/.aws/credentials file.
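If you go the credentials-file route, a minimal ~/.aws/credentials looks like this (the keys are placeholders; your DreamObjects keys go in the same fields AWS keys normally would):

```
[default]
aws_access_key_id = YOUR_DREAMOBJECTS_KEY
aws_secret_access_key = YOUR_DREAMOBJECTS_SECRET
```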

    Pushing from the outside inside

You may have noticed that, unlike rsync, I can’t tell these tools to push to another server. That is, with rsync I can say “sync files from server A to server B,” and with boto-rsync or the AWS CLI I can’t.

    And no, you just can’t.

    Unless you use rclone. Which is a little messy but totally do-able.
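A minimal sketch of the rclone route, assuming a remote named dreamobjects (the remote name, keys, and paths here are all placeholders):

```
# One-time: define the remote in ~/.config/rclone/rclone.conf
[dreamobjects]
type = s3
provider = Ceph
access_key_id = YOUR_DREAMOBJECTS_KEY
secret_access_key = YOUR_DREAMOBJECTS_SECRET
endpoint = https://objects-us-west-1.dream.io

# Then sync the bucket down (or define a second sftp remote and push server-to-server)
rclone sync dreamobjects:my-cool-icons /home/my_user/mydomain.com/wp-content/uploads/my-cool-icons
```

The “messy” part is that you’re maintaining rclone and its config on whatever box does the syncing.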

    There’s one other option though… I’ve been using Codeship to push code. That is, every time my GitHub repos update, it triggers a Codeship deploy. And in that deploy I usually just say “Rsync a git clone, k’thnx’bai.”

    Now I tell it this too:

    # Download from S3
    pip install awscli
    aws s3 --endpoint-url https://objects-us-west-1.dream.io sync s3://my-cool-icons/ ~/my-cool-icons/
    # Copy up to Server
    rsync -avz -e "ssh" ~/my-cool-icons/ USER@DOMAIN:/home/USER/DOMAIN/wp-content/uploads/my-cool-icons/ --delete
    

    I will note there’s one extra trick. You have to add Environment Variables for AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY on Codeship.

    This works for me since there are few times I push an icon without also making some code changes to the server code anyway.

Hopefully this gives you some direction.

  • Housing Large Media Files

    For the most part, the WordPress media library is fine. It falls down when we start needing to upload large files, though, for a variety of reasons. When we look at files like large PDFs or movies or podcasts, it’s really not a great solution to upload through WordPress itself. It’s slow, it’s clunky, and worst of all, those large file downloads can slow your site.

    The ‘right’ fix is to offload large media to servers that are built for this sort of thing. And in this case, I’m talking about Amazon AWS or DreamObjects.

    Of course, if you search for solutions like this, you’ll be disappointed. You will mostly find plugins that are geared towards syncing your media library with the cloud services. To be honest, the more I think about doing that, the less I feel like it’s a sustainable idea. Unless the CDN is super fast, it could actually make your site worse off by adding another domain to download from.

    I Don’t Trust Simple CDNs

    I’ve always been skeptical of CDNs in general. When there’s a shared library, it makes sense for everyone to call the same library. That keeps the world in sync. But your own media? The reason a CDN is good is that you can distribute your content across multiple locations. Provided you can actually, you know, do that. And keep them all in sync.

    Before hosting, I worked at a bank, and one of the headaches we had was pushing software updates across multiple servers and locations. After all, you can’t just upgrade the Chicago servers and not the LA and Atlanta ones. Plus you have to do them all at the same time, or make sure Jane in Idaho isn’t in the middle of depositing money when we reboot her server.

    Knowing how crazy all that is, I worry about keeping data in sync across all the servers. What happens when media is updated? Is the CDN built so that my primary location properly triggers updates for everything else, and the data is updated? No matter what, I’m sure I’ll end up with some data out of sync for at least a little while.

    In short, CDN synchronization isn’t simple and anyone who tells me it is, is selling something.

    So Why A CDN At All?

    Big files.

The goal of a CDN is to speed up delivery of content without slowing down your website. For most images on a website, this isn’t a huge issue. But for those big files, it sure is. And uploading them to the cloud means three things:

    1. No lost disk space
    2. No lost bandwidth (if someone’s watching a movie for example)
    3. No lost speed (see the aforementioned movie)

    The rest of your CDN ‘needs’ can be handled properly by caching. I prefer server side, but as you like it. This means if I upload my large files to the CDN, I can link directly to them in my post content. Everyone wins.

    Except Uploading Sucks

    The common solution is to manually upload the file via a client like Cyberduck or Transmit, copy the URL, and then paste it into a blog post. Yuck. What I need is a file manager for the cloud. And that doesn’t seem to exist for WordPress.

    So I made something. DreamHost Objects Dropzone lets me upload files to DreamObjects, through WordPress, without touching the file server at all. It’s not perfect. It can be slow when trying to get stats on all the items in a bucket, and I don’t quite have an interface to make it easy to insert links and content into posts. Yet.

    Something to look forward to though.

  • PSA: DreamObjects URL Changes

    If you use my DreamObjects plugins, don’t worry, I’ll have them updated before September.

    What Changed?

    As part of ongoing service improvements, DreamHost made a subtle, but critical, change to how everyone accesses DreamObjects. Specifically, they changed the DreamObjects hostname to prepare for upcoming enhancements.

    Old hostname: objects.dreamhost.com
    New hostname: objects-us-east-1.dream.io

    I’m a developer who uses DreamObjects. What do I need to do?

    If you were ever using the URL objects.dreamhost.com in your site, you need to change it to objects-us-east-1.dream.io and everything will be awesome.

    Is it really that simple?

    Not always. For some plugins and code, it is that simple. For my backup plugin, I added in a new option:

    if ( ! get_option( 'dh-do-hostname' ) ) {
        update_option( 'dh-do-hostname', 'objects-us-east-1.dream.io' );
    }

    This way I can later make it a dropdown for people to select as they want.

But for my CDN plugin there’s a major issue. You see, the way it works is that it updates the URLs of images within your posts. It literally edits your post content. And that means a change has to go back and fix all the URLs. I had to write some fairly weird code to do this. I’m still testing it fully, and I think it’ll do everything I need, except it’s not going to be perfect.

Right now it does its utmost to fix any URLs on a site; however, it will only fix the ones for images it can detect are on DreamSpeed. I am aware that some people with a phenomenal number of images manually copied their uploads folder to DreamObjects and ran a command like this:

    wp search-replace 'example.com/wp-content/uploads' 'objects.dreamhost.com/bucketname/wp-content/uploads'

Or maybe they used the InterconnectDB script or another plugin. Those are the people you’re going to have to watch out for and inform, in a useful way, as to how to fix it.

    I’m an end user. Do I need to care?

    Yes. You do. Do not try to fix it on your own just yet. Not until the plugins you use have updated and said they’ve corrected the issue. Then read the FAQ or any alerts to make sure you don’t need to do anything else. If you’re not using DreamObjects as a CDN, this should be pretty painless.

    If you did, or if you moved it all manually, you will have to do it again before September 5th, 2016 or all images will break.

    Thankfully, if you’re on DreamHost, you can run the following command:

    wp search-replace objects.dreamhost.com objects-us-east-1.dream.io

    That will upgrade everything for you. If you’re not, you’ll need to use the search/replace tool of your choice.
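If you want to sanity-check the swap before touching your database (wp search-replace also has a --dry-run flag for exactly this), you can preview what the rewrite does to a sample URL. This is just an illustration with sed, not part of the actual migration:

```shell
# Preview how the hostname change rewrites a stored URL
echo 'http://objects.dreamhost.com/bucket/image.jpg' \
  | sed 's/objects\.dreamhost\.com/objects-us-east-1.dream.io/'
# → http://objects-us-east-1.dream.io/bucket/image.jpg
```

The bucket and path portions of the URL are untouched; only the hostname changes.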

    Again: WAIT until any plugin you use for this has been updated. I personally contacted everyone in the WordPress.org repository and informed them about it, but since I know that’s not perfect, I’m doing this too.

    This sucks!

    I know. It does. But I don’t see a better way around it.

    When will your plugins be updated?

    Before the end of May. I just want to test it on as many large sites as I can find.

  • Prettier DreamObjects Listing

    A long time ago, I wrote about how to list your objects on DreamObjects. This code still works. I’m using it today, and as long as my bucket is public, it’s perfect.

It’s also kind of ugly, and it’s a hassle to get the link. I have to right-click every time, when really I just want to grab the link, paste it somewhere else, and go.

    Thankfully it’s 2016 and there’s a library called clipboard.js that does this.

Before I did that, I decided how I wanted the layout to be: two buttons, one to copy the link and one to open it in a new window.

    The button list

    Looks pretty basic, I know, but it’s better than that list.

    Next I took my foreach loop and made it this:

        foreach ( $bucket_contents as $file ) {

            $fname        = $file['name'];
            $fname_pretty = strtolower( substr( $file['name'], 0, strrpos( $file['name'], '.' ) ) );
            $furl         = 'http://' . $bucketName . '.objects.dreamhost.com/' . $fname;

            echo '<div class="example">';
            echo '<button class="btn" data-clipboard-demo data-clipboard-action="copy" data-clipboard-text="' . $furl . '" aria-label="Copied!">Copy link to ' . $fname_pretty . '</button> ';
            echo ' <a class="btn" href="' . $furl . '" target="_blank">Open link to ' . $fname_pretty . '</a>';
            echo '</div>';
        }
    

    This gave me the output and set me up for the script code at the bottom of the page:

        <script src="clipboard.min.js"></script>
        <script src="tooltips.js"></script>
        <script>
        var btns = document.querySelectorAll('button');
        var clipboard = new Clipboard(btns);
        clipboard.on('success', function(e) {
            console.log(e);
            showTooltip(e.trigger, 'Copied!');
        });
        clipboard.on('error', function(e) {
            console.log(e);
        });
        </script>
    

Obviously clipboard.min.js is what you think it is, but tooltips.js is there so you get that nice showTooltip popup letting you know what you just did.

    When you combine that with the work I did the other day for a droplet, all I have to do is keep that page up and handy to throw links at people.

    You can see it at https://ipstenu.org/dreamobjects/.

Before you ask: since not everything there is an image (‘Wrong’ is Dom DeLuise shouting that), and since the images are all varying sizes, I went with not showing the images at all.

  • DreamObjects Droplet

    I have DreamObjects, I have Transmit, and I have gifs I like to throw at people. I’d been using a tool called CloudUp, which is cool, but it had been broken for a while and was bothering me, so what I really wanted was a way to save my images to DreamObjects. It’s dogfooding, right?

    Add To Transmit

    I like Transmit. I have the desktop app and I’ve used it for years. They do it right. Early on they added S3 to their app, so adding DreamObjects to it is so simple it’s painless. We have cool directions on the DreamHost KB about how to do it.

    Make a Droplet

    Once you’re connected, you need to make a droplet. I happen to have a bucket called ipstenu-images which I use for all my images. I know, it’s totally imaginative. Next I hit right-click and selected ‘Save Droplet For Folder’. It looks like this:

    Select the folder you want and right-click. Pick Make a droplet

I chose to save mine in the ‘favorites’ folder, which I sync on Dropbox. I also chose to save the password information, which, if you’re doing this on your own computer, is relatively safe.

    When it saved, it opened up a Finder window with my new Droplet, named “DreamObjects droplet.” I dragged that to my taskbar and tested it by dropping an image.

    nope

    That’s my default test image.

    Set Permissions Properly

    Except it didn’t work!

    Permissions Error

    The permissions error was because the upload defaulted to only letting the owner see the image. Thankfully this is an easy preference setting in Transmit. Go into Preferences, then Rules, and at the bottom you have permissions. Click on the S3 tab and change “Read” from “Owner” to “World.”

    Change S3 Defaults to READ: WORLD

    Now anyone can go to https://ipstenu-images.objects.dreamhost.com/nope.gif and see a nope-ta-pus!
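If you’ve already uploaded an object with the wrong permissions, you don’t have to re-upload it. Assuming you have the AWS CLI configured with your DreamObjects keys, something like this flips an existing object to world-readable (the bucket and file here are the ones from this post; adjust to taste):

```
aws s3api put-object-acl \
  --endpoint-url https://objects.dreamhost.com \
  --bucket ipstenu-images \
  --key nope.gif \
  --acl public-read
```

Handy for cleanup, though fixing the Transmit default means you won’t need it going forward.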