I haven’t been keeping track of daily progress as I had been before, so here’s a rough outline of our progression. Largely we’ve been spending time figuring out how to use CouchDB.
Setting up Platforms
With Node.js somewhat understood, we worked on expanding our understanding of the development platforms.
Jason chose to use a VirtualBox VM of Linux Mint, installing Node.js and CouchDB on it, and then running the Linux desktop in the VM window on his PC laptop. He can use the Linux terminal windows and browsers.
I chose to create a Node.js + CouchDB VM based on the Bitnami VM (Ubuntu 12.04), and then create an SSH tunnel from my PC to the Node.js port in the VM. This allows me to use my PC’s terminal (PuTTY) and normal desktop browser, which was my preference.
Picking a Database
Jason decided he wanted to look at CouchDB as the database of his choice. We spent a couple of days wrapping our heads around the way it worked, and then understanding why we’d want to use it:
CouchDB is designed for use on the web, so it speaks the HTTP protocol, and implements the GET, POST, DELETE, etc. actions defined in it for its operations. To make CouchDB do something, you connect to special URLs using an HTTP connection to its TCP/IP port on the server. This is separate from the usual web server running on port 80. You do all your normal database actions this way. Node.js has several modules that make connecting to your CouchDB database less HTTP-like and more programmatic in feel, but to understand Couch you should understand how HTTP works.
When it comes to getting data out of the database, CouchDB doesn’t use SQL “joins” to select a subset of matching records collected from multiple tables. Instead, you define a “map” function that goes through all the data and collects the possible values you want into a big list. Then, to get the specific values you want out of that list, you run a “reduce” function that outputs a final result. Together, this is known as “MapReduce”. The big advantage? It scales! It’s easier to parallelize operations by having multiple servers run the map and reduce functions over different stores of data, then collecting the aggregate results. Synchronizing this activity is apparently built-in.
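The two-step idea can be illustrated without CouchDB at all. This is just plain JavaScript over an in-memory array — the documents and field names are invented for the example — but it mirrors the shape of the map and reduce steps:

```javascript
// Plain-JavaScript illustration of the map/reduce idea — not CouchDB's API.
// The documents and field names below are invented for the example.
var docs = [
  { type: 'sale', amount: 10 },
  { type: 'sale', amount: 25 },
  { type: 'refund', amount: 5 }
];

// "Map": walk every document and emit a [key, value] pair for the ones we want.
var emitted = [];
docs.forEach(function (doc) {
  if (doc.type === 'sale') emitted.push([doc.type, doc.amount]);
});

// "Reduce": fold the emitted values down to a final result.
var total = emitted.reduce(function (sum, pair) { return sum + pair[1]; }, 0);
console.log(total); // 35
```

In CouchDB the map and reduce functions live on the server and can run over different chunks of data on different machines, which is where the scaling win comes from.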
Every database lives at a root URL like http://127.0.0.1:5984/mydatabase, which you create using a PUT action via HTTP. Below this root can exist a number of special URLs that have a particular function. For example, the mydatabase/_design URL points to the database’s “design document”, which holds the definition of various functions. This is where it starts to get into the weeds for me, and I haven’t learned that much about the particulars yet.
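From what I’ve gathered so far, a design document is just a JSON document with a few conventional fields. Here’s a hedged sketch — the view name by_type and its contents are made up for illustration; only _id, views, map, and the built-in “_count” reduce follow CouchDB’s conventions:

```javascript
// Hedged sketch of a design document. The view name "by_type" is invented;
// _id, views, map, and the built-in "_count" reduce are CouchDB conventions.
var designDoc = {
  _id: '_design/example',
  views: {
    by_type: {
      // View functions are stored as strings of JavaScript source.
      map: "function (doc) { if (doc.type) emit(doc.type, 1); }",
      reduce: '_count'
    }
  }
};

// You would PUT this JSON to http://127.0.0.1:5984/mydatabase/_design/example
console.log(JSON.stringify(designDoc));
```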
CouchDB can serve web pages too, which are stored as attachments. Its function overlaps with Node.js here, but I haven’t looked too much into that yet. I believe these are called ‘CouchApps’.
Talking to CouchDB
Jason has been working on getting CouchDB to handle user authentication for his math quiz students, which meant understanding not only Node.js (which he’s using to host his app), but also how authentication works in a RESTful environment consisting of a client and a server speaking HTTP. This is a pretty big chunk of concept to absorb, so we’ve been spending quite a bit of time going over it. We’ve also been going over debugging techniques to verify that our assumptions about how it works are true. With code running on Node, CouchDB, and the client browser, it’s challenging to find exactly how to inspect what’s going on. It’s taken most of the week to get to the point where Jason’s code is doing exactly what he thinks it should be doing, and it’s been (I think) a good lesson in practical debugging technique with Google at your side. There is a tendency these days, I’ve noticed, for new programmers to Google for a specific answer rather than finding canonical documentation to develop an accurate mental model and test their assumptions.
As for me, I’ve yet to write any of my own code, but I’m thinking of using the web-based interface for CouchDB to create a database with my “business operations” in it. Then, I could start to write code that helped me manipulate it. That’s the goal for this Saturday.