I’ve found that I need to work on a variety of projects nearly all day, every day, and in a wide variety of settings and environments. That means I need to be just as functional on a random Linux terminal as I am on my laptop, and in turn just as functional on those as I am on my dedicated workstation at home.
Not being tied to a single machine is beneficial. Being able to get work done on anything, anytime, anywhere, is essential. Here’s how I make it work:
Make Your Data Portable and Accessible
Use Cloud Storage to Always Have Access to Your Data
You can’t work if you don’t have access to your projects. To make sure I can always access my files, I use cloud services like Dropbox and Google Drive. By using cloud storage, my work is always in a consistent state across all my machines. It’s automatic and it’s (nearly) seamless.
If you’re trying to be frugal, make use of multiple providers and prioritize them for different things. For example, because I don’t want to spend much money (I am a student after all), I use both Dropbox and Google Drive. How I divide things up is a little odd, and very particular to my case, so I’m not recommending this to everyone.
Oddly, I pay for Google Drive but not for Dropbox, even though I actually like Dropbox more. Dropbox offers a much better synchronization experience than Google Drive: its client is far lighter, often using about a tenth of the memory that Google Drive’s does, and it supports network rate limiting and LAN syncing, which Google Drive does not. The lack of network control is the downside I notice most, since each of my machines has enough RAM that a single program rarely impacts performance. Without any networking controls, though, Google Drive will use all the bandwidth it can get, and can completely clobber a connection. Ultimately, however, Dropbox is more expensive than Google Drive (while writing this I’ve learned that Dropbox has gotten cheaper since I last checked, so I may be making some changes!).
Since I use two different providers with two different size offerings (100GB on Google Drive, just 6GB on Dropbox), I split my use of them accordingly:
- Dropbox is where I store my most needed and most sensitive data: the projects I work on day-to-day and my school work. I also use Dropbox for sharing files between my handheld devices (my Kindle Fire and my cell phone), since its Android interface is so much better than Google Drive’s.
- Google Drive is where I store pretty much everything else that I might need day to day, but less urgently. This includes things like my (relatively small) photo collection, game saves shared across computers, and other persistent program data.
Something to note is that I could arrange this in a nested configuration, with Dropbox sitting inside Google Drive, so that all my most important files would automatically be saved to both services. Though this is a good idea in theory, I’ve had enough problems with Google Drive conflicting with other tools that also want to monitor its files that I’m wary of spending time on it.
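For what it’s worth, that nested arrangement could be approximated with a symlink, so Google Drive would also sync whatever Dropbox holds. A minimal sketch using throwaway directories (the real paths would be something like ~/Dropbox and ~/Google Drive; I don’t actually run this, for the reasons above):

```shell
# Stand-in folders; in practice these would be the real sync directories
base="$(mktemp -d)"
mkdir -p "$base/Dropbox" "$base/Google Drive"

# A "Dropbox" entry inside Google Drive that points at the real Dropbox
# folder, so anything saved to Dropbox is also seen by Drive's sync
ln -s "$base/Dropbox" "$base/Google Drive/Dropbox"
```

Whether the Drive client follows symlinks reliably is exactly the kind of conflict that made me wary of this setup.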
Track your Progress and Control Changes
git (or any VCS) to Organize Your Work Through Time
Prior to being introduced to the concept of Version Control Systems (VCS), tracking edits and changes as time passed was arduous and clumsy. I’d create a new copy of whatever I was working on, edit that, and save it with the revision number. I’d end up with a series of folders that looked like this:
.
├── myProjectv1
│   ├── configurationFile.config
│   ├── otherImportantProjectFile.code
│   ├── other_project_files/
│   └── projectFile.code
└── myProjectv2
    ├── configurationFile.config
    ├── otherImportantProjectFile.code
    ├── other_project_files/
    └── projectFile.code
It could get really messy, really quickly. And if other people were working on the project as well, you’d then have to manually integrate their changes, a process that might drive a person mad. The problems with this kind of system are many, and I wanted a better way.
Version control systems allow for seamless control and tracking, and I was introduced to them when I started working with others. I’ve found that even though I’m a single person, applying the same management methods to my own work that I’d apply to group work saves me time and spares me the cycles wasted on a bad system.
For those who aren’t aware, git is a version control tool that makes tracking changes to a project easy, even with large numbers of people involved. The features that let it work so well as a collaborative tool also make it great for a single user: it lets you track, in both very fine and very coarse increments, the changes and the progress of a project or file. I find this invaluable for nearly everything I do, from the notes I take in class to the side projects I work on every day. The paper trail built by using git makes coming back to a project easy, since you can see how it has evolved since the last time you looked at it. Other times, you need to see when you made a change, for example, to know what day you added a particular section to your class notes.
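As a concrete sketch of what that paper trail looks like, here is the cycle I mean, run in a throwaway directory (the file name is just an example):

```shell
# A throwaway repo standing in for a real notes directory
cd "$(mktemp -d)"
git init -q
git config user.email "me@example.com"   # local identity so commits work anywhere
git config user.name  "Me"

# Track the notes from day one
echo "Lecture 1: introduction" > cs-notes.txt
git add cs-notes.txt
git commit -q -m "Add lecture 1 notes"

# Later additions get their own commits...
echo "Lecture 2: pointers" >> cs-notes.txt
git commit -q -am "Add lecture 2 notes"

# ...so finding out when a section appeared is a single command
git log --format="%ad %s" --date=short -- cs-notes.txt
```

A search like `git log -S "pointers"` narrows that history down to the commit that introduced a particular phrase.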
git also allows you to divide up and share your projects with any computer that has git installed (provided you have a remote repository you can access). In this way, when push comes to shove, you can use git as a synchronization tool across computers.
This quick-and-dirty sync method is primarily how I get work done on things like miscellaneous Linux terminals, where I need my files but can’t access them via Dropbox. All I need to do is clone the repository sitting on my remote server, and I’m ready to work. Once I’m done, I commit and push everything back to the remote server. When I get back to one of the computers that has access to my synchronized files, I just do a git pull and my work is synced across all my computers.
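The whole round trip can be simulated locally, with a bare repository standing in for my remote server (all the paths here are throwaway directories, not my real setup):

```shell
# A bare repo plays the role of the remote server
# (-b needs git >= 2.28; it just fixes the initial branch name)
remote="$(mktemp -d)/project.git"
git init -q --bare -b main "$remote"

# On the borrowed machine: clone, work, commit, push
work="$(mktemp -d)/project"
git clone -q "$remote" "$work"
cd "$work"
git config user.email "me@example.com"
git config user.name  "Me"
echo "progress" > notes.txt
git add notes.txt
git commit -q -m "Work from a borrowed machine"
git push -q origin HEAD:main

# Back home: a git pull (or a fresh clone) picks everything up
home="$(mktemp -d)/project"
git clone -q "$remote" "$home"
```

In real use the bare repo lives on a server I can SSH into, and the clones are my various machines.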
Using version control to track my files helps me stay organized, forces me to do more self-documentation, and gives me a simple sharing and synchronization tool, all in one. I find it indispensable.
Work to Make Your Tools and Environment Consistent, Even Across Platforms
However you get work done, try to find a way to do it consistently across all your computers, preferably by using tools that work wherever you need them. For example, I spend pretty much all day working with text, so I need a text editor that’s sufficiently powerful. Additionally, to follow my own rules about unifying my environment, I need a text editor that works on all the platforms I use (Windows and Linux). I finally settled on Sublime Text 2, though many people pick tools like Vim or Emacs for the same reasons I picked ST2.
Additionally, I work with the command line constantly, so I use Cygwin on Windows to simulate the “*NIX” environment I get on the other platforms I use.
Within the command line, I use a custom configuration that makes the terminal experience even more to my liking. This includes aliases to common commands, scripts I find useful, and terminal colors to help when reading lots of text.
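A few representative lines from that kind of configuration (these particular aliases and colors are just my own habits, not anything standard):

```shell
# Shortcuts for the commands I type constantly
alias ll='ls -lh --color=auto'
alias ..='cd ..'
alias gs='git status'

# Color grep's matches, which helps when reading lots of text
alias grep='grep --color=auto'

# A colored prompt: green user@host, blue working directory
PS1='\[\e[32m\]\u@\h\[\e[0m\]:\[\e[34m\]\w\[\e[0m\]\$ '
```

Because this all lives in one file, any machine that sources it immediately feels familiar.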
There are plenty of other small things you can do to make your environment more consistent, that have less to do with the tools you use and more to do with the “feel” of your environment. A specific example of mine: on every computer I use for long periods of time I install Spotify so I can listen to all my music without missing a beat. On top of that, I’ve found ways to always have the exact same key combinations for controlling that music, so I don’t have to think about it when I want to skip a song or pause the music. It’s often these little details that make something feel distinctly “yours” and make you feel the most at home.
Automate the Setup of Your Environment and Tools
It’s key to be able to set up a new environment fast. Having all these other procedures does you no good if switching computers halts all your work. Now, I wouldn’t go so far as to say you must be able to instantly set up a replica environment the moment the need arises, but it’s good to have at least a basic level of functionality available quickly. For me this takes the shape of a “bare-bones” install option in my dotfiles repo that sets up a *NIX environment with the essentials of what I need and with as little chance of breaking things as possible. It basically sets up my .bashrc, my Vim colorscheme, and the Vim plugins I like. That way, even if I only have a terminal, I can get to work.
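A sketch of what such a bare-bones install step might look like; the repository URL and file names are placeholders for my actual dotfiles, not anything you can clone:

```shell
# Bare-bones bootstrap, written as a function so it can be rerun safely.
bootstrap() {
    local repo="$1"                 # e.g. ssh://me@myserver/srv/git/dotfiles.git
    local target="$HOME/.dotfiles"

    # Fetch the dotfiles if they aren't already present
    [ -d "$target" ] || git clone -q "$repo" "$target"

    # Link the essentials into place; -f makes reruns harmless
    ln -sf  "$target/bashrc" "$HOME/.bashrc"
    ln -sf  "$target/vimrc"  "$HOME/.vimrc"
    ln -sfn "$target/vim"    "$HOME/.vim"
}
```

Keeping it to symlinks means the “install” is instant and trivially reversible, which is exactly what you want on a machine you might only use for an afternoon.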
For more extended use though, it can be really wonderful to be able to just run a script and have your entire developer environment set up just so.
The added benefit of all this is that you’ll come to know your tools much more closely when you have to figure out how to get at their dirty underbellies for this kind of automation. In fact, you may even end up building solutions to some rather unique and worthwhile problems in the course of making it all happen. However, even if you don’t end up doing anything quite so amazing, you’ll at the very least end up learning (and in the end, isn’t that what it’s all about?).
Document How You Solve Your Problems
If you take just one thing from this whole post, make sure it’s this advice. Documenting your environment is totally key to all of the above, in particular writing down how you go about fixing the various problems you encounter.
All the experience I have actually comes from this one rule, a rule I set for myself some years ago. Way before I knew anything about programming, I was constantly messing with existing software, trying to do cool things. Back then, I’d frequently reach the limit of my own knowledge and come crying to Google for answers. As time went on and what I worked on fell further and further from the mainstream, it became harder and harder to find the answers I needed. My experiences from those days are perfectly summed up by this XKCD (there really is an XKCD for everything). After being spurned so many times by so many different problems, I vowed that I’d forever record how I solved the problems I encountered with the software I had. Indeed, I’ve left these instructions scattered across the web in various forums, blogs, pastebins, and gists.
Documenting procedures and solutions is vital to your mental well-being if you’re trying to be nimble in how you deal with (sometimes obscure) software. It’s the halfway step before automating things for yourself, and you’ll thank yourself when you revisit some random piece of software three years after you last touched it, run up against that same weird bug that stumps Google, and discover that you eventually found a workaround for it, and that you documented it!
Trust me, you’ll never regret writing things like this down.
That’s All Folks
- Use cloud storage to always have access to your data
- Use version control software to track your work through time
- Unify your work environment, make it consistent
- Automate the setup of your environment for minimal time wasted
- Document how you solve your problems (for your own sanity)