Workshop 10: Adding a Database
- Overview
- Big Picture
- Install MongoDB
- Prepare the Git Repository
- Set Up the Database
- MongoDB Overview
- Facebook Changes
- Populate the Database with Mock Objects
- Change Current User ID to a String
- First Route: GET /user/:userid/feed
- Change Other Routes to Use MongoDB
- Update Comment Routes to Use MongoDB
- Conclusion and Other Resources
- Submission
Overview
In the previous workshop, we added a web server to Facebook, and migrated the mock database from the client into the server. We also added basic HTTP request authentication using unencrypted JSON web tokens.
In this workshop, we will migrate from a mock database to a real database, powered by MongoDB. We will change the server routes to interact with the real database. Best of all, we will only need to make one small client change to fix hardcoded user IDs!
Big Picture
At this point, it may be helpful to remind you all of the “big picture”, and how everything connects up.
In a real web application, the client runs on the user’s computer (in a web browser), while the server and the database run on the company’s servers (typically owned by and leased from a cloud computing company, such as Amazon, Google, or Microsoft).
During development, when you need to work on all of the components of the application, you’ll run all of these pieces on your own laptop/computer.
Install MongoDB
As a prerequisite for this workshop, you need to install MongoDB Community Edition on your computer. Follow the instructions for your operating system below.
Windows
Use this guide to choose which version of MongoDB Community Edition to install. Then, simply run the installer.
To test the installation, run the following in your terminal:
$ "C:\Program Files\MongoDB\Server\3.2\bin\mongod.exe" --version
Mac OS X
There are two ways to install MongoDB Community Edition on Mac OS X.
Via Homebrew
If you know what Homebrew is and you have it installed, simply run brew install mongodb, and you’re done. Verify the install by running mongod --version.
If you do not know what Homebrew is, you may want to check it out. We will not support getting Homebrew up and running, though.
Via Tarball
If you do not know what Homebrew is, simply run the following commands from your class folder to install MongoDB into a mongodb folder in your class folder:
$ curl -O https://fastdl.mongodb.org/osx/mongodb-osx-x86_64-3.2.4.tgz
$ tar -zxvf mongodb-osx-x86_64-3.2.4.tgz
$ mv mongodb-osx-x86_64-3.2.4/ mongodb
Then, edit your ~/.bash_profile to include the following line (change /path/to/ into the path to your class folder). If ~/.bash_profile does not exist, simply create the file.
export PATH=/path/to/mongodb/bin:$PATH
Note: You can run atom ~/.bash_profile from the terminal to launch the Atom editor to change or create the file.
This change tells the terminal where MongoDB-related terminal commands can be found. The terminal only reads this file when it starts up, though. Close your terminal, and re-open it. Then, verify that everything is working properly by running:
$ mongod --version
Linux
Follow the instructions on MongoDB’s website for your version of Linux.
Once done, verify that the install worked by running mongod --version.
Prepare the Git Repository
The first thing you need to do is create a fork of the public Git repository.
Next, clone the Git repository into your class folder, cd into it, then run the following commands to install the dependencies of both the server and the client:
$ cd client
$ npm install
$ cd ../server
$ npm install
Set Up the Database
Before we can do anything with the database, we need to set it up.
Create Folder for Data
The first step to setting it up is to create a folder that it can store data into.
We do not want the database to store data in our Git repository. Git repositories are for code, not data! Create a folder named workshop-10-data in your class folder. Then, open up the terminal to your class folder, and start the database with:
$ mongod --dbpath workshop-10-data
Note: If you are on Windows, run "C:\Program Files\MongoDB\Server\3.2\bin\mongod.exe" --dbpath workshop-10-data instead.
Like with other servers in the course, the database will remain running as long as you have that terminal open. Keep the terminal open for the duration of the workshop.
MongoDB Overview
Let’s compare MongoDB to our mock database. Then, we will walk through writing a simple NodeJS program that interacts with the database.
Documents: Mock vs MongoDB
The structure of a Mongo database is very similar to our mock database. A Mongo database contains multiple document collections. Each document collection contains JSON documents. Each document within a collection contains a unique _id. MongoDB _id values are different from the mock database’s, though; we will explain the difference shortly.
For Facebook, we have the following document collections in the mock database, which we will port over to MongoDB:
- users
- feedItems
- feeds
We will need to change how we treat document IDs, though. By default, MongoDB will generate IDs for new documents called ObjectIDs. ObjectIDs are 12 byte numbers. Normally, you will see ObjectIDs as a hexadecimal string with 24 characters, like this:
"56fda6f887e62a1c0f662e64"
When you send a database object in an HTTP response, its _id will automatically convert from an ObjectID to a hex string like the above. The same thing happens when you console.log an ObjectID. So, the client will see all document IDs as strings, and you will probably start to think of them as strings, too.
But… note that ObjectIDs inside the database are not strings. The above is much like the base64 encoding that we used for the JSON web token: It is a string-based representation of arbitrary data. When the client sends a string ID back to us, we will need to convert it back into an ObjectID before asking the database for the document with that ID. When we store a reference to another document into a document, we should store it as an ObjectID and not a string.
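To see the relationship concretely, here is a small plain-Node sketch (no MongoDB required) showing that the 24-character hex string above is just an encoding of 12 bytes of data:

```javascript
// A 24-character hex string encodes exactly 12 bytes of data --
// the same raw data that the database stores as an ObjectID.
var hexId = "56fda6f887e62a1c0f662e64";

// Decode the hex string back into raw bytes.
var rawBytes = new Buffer(hexId, 'hex');
console.log(rawBytes.length); // 12

// Re-encoding the bytes as hex recovers the original string,
// just like converting an ObjectID to a string does.
console.log(rawBytes.toString('hex') === hexId); // true
```

This is why the string form and the ObjectID form are interchangeable representations of the same ID, but are not equal to each other as JavaScript values.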
Database Operations: Mock vs MongoDB
MongoDB supports all of the same operations that our mock database did (and more!), but all of the operations are asynchronous.
The mock database supports the following basic operations:
- addDocument: Add a new document to a collection.
- readDocument: Read a document from a collection.
- writeDocument: Update a document in a collection.
- deleteDocument: Delete a document from a collection.
- getCollection: Retrieve an entire collection.
Similarly, MongoDB supports the following basic operations:
- insertOne/insertMany: Add a new document (or documents) to a collection.
- find/findOne: Locate a particular document or set of documents in the database that match a given query.
  - Identical to readDocument when the query is to get a document with a particular _id
  - Identical to getCollection when the query is to get all documents
  - Can also be used for more complicated queries, e.g. “get me all documents containing this text in the message property”
- updateOne/updateMany/replaceOne: Update a document in a collection.
- deleteOne/deleteMany: Delete a document (or documents) from a collection.
To perform an operation on a particular collection in the mock database, you add the name of the collection as the first argument. For example, to add a user to the 'users' document collection, you would run:
addDocument('users', userObject);
In MongoDB, you would write the following to add a user to the ‘users’ document collection, where db is an open connection to the database:
// Get the 'users' document collection from the database.
var usersCollection = db.collection('users');
// Insert a single user into the collection.
usersCollection.insertOne(userObject, function(err, result) {
if (err) {
// Operation failed.
throw err;
} else {
console.log("Operation completed.");
}
});
Notice how insertOne takes a callback function. Queries to MongoDB are asynchronous, much like how queries between the client and server are asynchronous. Asynchrony is the main reason why JavaScript is so different from other programming languages you may have used in the past, like Java or C!
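The error-first callback convention that insertOne follows appears throughout NodeJS code. A minimal sketch of the pattern, with no database involved:

```javascript
// A toy function that follows Node's error-first callback
// convention: the callback receives (err, result), where err is
// non-null only when the operation failed.
function divide(a, b, callback) {
  if (b === 0) {
    // Operation failed: pass an Error as the first argument.
    callback(new Error("Division by zero"));
  } else {
    // Operation succeeded: err is null, result comes second.
    callback(null, a / b);
  }
}

divide(10, 2, function(err, result) {
  if (err) {
    throw err;
  }
  console.log("Result: " + result); // Result: 5
});
```

Every MongoDB callback you write in this workshop follows this same shape: check the error argument first, then use the result.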
Atomicity: Mock vs MongoDB
So far, we have assumed that multiple interactions with the mock database are atomic. We cannot make that assumption with MongoDB, and will need to be careful with database updates to avoid losing data.
By atomic, I mean that no other database operations jump in-between or interleave with a written sequence of database operations. For example, if you wrote the following code with the mock database, you would assume that you are properly updating a user’s name:
var user = readDocument('users', 4);
user.name = "John Vilk";
writeDocument('users', user);
However, what if, in parallel, the following code was running?
var user = readDocument('users', 4);
user.email = "[email protected]";
writeDocument('users', user);
Do you see the issue?
Assume that both of these code sequences execute readDocument at the same time, and they both read the user object { _id: 4, name: "Tom", email: "[email protected]" }. Now, the first code sequence will try to update the user object with { _id: 4, name: "John Vilk", email: "[email protected]" }, and the second will try to update the user object with { _id: 4, name: "Tom", email: "[email protected]" }.
The final state of the user object depends on which of those updates finishes last.
We can no longer ignore these issues with MongoDB. With MongoDB, we can assume that individual database operations are atomic, but not sequences of operations.
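To make the lost update concrete, here is a plain in-memory JavaScript sketch of the interleaving described above (no real database; the variable stored stands in for the document on disk):

```javascript
// A stand-in for the stored user document.
var stored = { _id: 4, name: "Tom", email: "[email protected]" };

// Both "clients" read the document at the same time (each gets a copy).
var copyA = Object.assign({}, stored);
var copyB = Object.assign({}, stored);

// Client A updates the name; Client B updates the email.
copyA.name = "John Vilk";
copyB.email = "[email protected]";

// Client A writes back first...
stored = copyA;
// ...then Client B's write clobbers it.
stored = copyB;

// The name change is lost: stored.name is "Tom" again.
console.log(stored);
```

Whichever write lands last wins, silently discarding the other client's change. This is exactly the hazard that single-operation atomic updates avoid.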
Thankfully, MongoDB supports complex update operations, which will perform complicated object update operations in a single atomic database action. An update operation takes two arguments:
- A filter document, which determines which document(s) to update.
  - The filter document { _id: ObjectID("56fda6f887e62a1c0f662e64") } matches documents with the specified _id.
  - The filter document { name: "John Vilk" } matches documents with the name property "John Vilk".
- An update document, which determines how to update the matched document.
  - The update document { $set: { name: "John Vilk" } } changes the matched document’s name field to "John Vilk".
    - $set is one of MongoDB’s update operators, which you can use in update documents.
    - You can use $set on multiple fields, e.g. { $set: { name: "John Vilk", email: "[email protected]" } }.
    - You can use other update operators in parallel, e.g. { $set: { name: "John Vilk" }, $mul: { "income": 10 } } changes the matched user document’s name to "John Vilk" and multiplies its income by 10.
  - The update document { $push: { likeCounter: userId } } adds the userId to the matched object’s likeCounter array using the $push operator.
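As a toy illustration (this is not MongoDB's implementation, just a sketch of the semantics described above), applying a { $set: ... } update document to a matched document amounts to the following:

```javascript
// Toy sketch of $set semantics: copy each field in the $set
// sub-document onto the matched document. MongoDB does this
// atomically inside the database; we do it in plain JavaScript
// only to illustrate the effect.
function applySet(doc, updateDocument) {
  var fields = updateDocument.$set;
  Object.keys(fields).forEach(function(key) {
    doc[key] = fields[key];
  });
  return doc;
}

var user = { _id: 4, name: "Tom", email: "[email protected]" };
applySet(user, { $set: { name: "John Vilk" } });
console.log(user.name); // "John Vilk"
```

The crucial difference from the lost-update example earlier is that the real database performs this read-modify-write internally, in a single atomic operation, so no other client's write can interleave with it.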
With an update operation, we can tell the database to change the email field on the user object with a given ID in a single database operation. The database will pass us a result object telling us if anything was actually updated.
db.collection('users').updateOne({ _id: ObjectID("000000000000000000000004") },
{ $set: { email: "[email protected]" } }, function(err, result) {
if (err) {
// Failed due to a database error
} else {
// Operation succeeded, but this does not mean that any documents were updated!
if (result.modifiedCount === 1) {
// Filter matched at least 1 document, which the database modified.
} else {
// Filter did not match any documents, so no change was made.
// It is likely that this document was removed from the database prior
// to the operation.
}
}
});
Atomicity is primarily an issue when you are writing to the database. It can also be an issue when reading data, such as trying to resolve the author of a comment just as that user deletes their account, but these issues are typically harmless and do not destroy database state.
Hello, Database!
Let’s write a simple JavaScript program that writes and reads stuff from the database.
Create the file server/src/hellodatabase.js. Then, import the MongoDB client, which is just another NodeJS library that you conveniently installed when you ran npm install:
var MongoClient = require('mongodb').MongoClient;
Next, we need to tell the MongoClient to connect to our database! By default, mongod runs MongoDB on port 27017. We want to connect to MongoDB, and work within a database named facebook (MongoDB can host multiple databases at once).
With that in mind, the URL to your facebook database is mongodb://localhost:27017/facebook (localhost is shorthand for your current computer). Let’s connect to it:
// Connect to database named 'facebook'.
var url = 'mongodb://localhost:27017/facebook';
MongoClient.connect(url, function(err, db) {
if (err) {
throw new Error("Could not connect to database: " + err);
} else {
console.log("Connected correctly to server.");
// This is where we will kick off other actions that use the database!
}
});
As you might have guessed, MongoClient.connect is asynchronous. It kicks off a connection request to the database, and only calls the callback function when it succeeds or fails. If your program needs to do anything with the database connection, you will need to kick it off in the callback function we provided.
Now, let’s write a function that inserts a simple document into a new document collection called “helloworld”. We do not need to tell MongoDB to create a new document collection at all; it will create it automatically when we tell it to insert a document into it:
/**
* Inserts a simple document into the 'helloworld'
* document collection.
* @param db The database connection
* @param callback A callback function to call once the
* operation completes. We will pass back the new object's
* ID.
*/
function insertExample(db, callback) {
// A document is just a JSON object, like in our mock database.
var exampleDocument = {
message: "Hello, world!"
};
// Insert the example document into collection 'helloworld'.
db.collection('helloworld').insertOne(exampleDocument, function(err, result) {
if (err) {
// Something bad happened, and the insertion failed.
throw err;
} else {
// Success!
console.log("Successfully updated database! The new object's ID is " + result.insertedId);
callback(result.insertedId);
}
});
}
Above, we use the insertOne function on the database collection to insert a single document into the database. When the operation completes (or fails!), the MongoDB NodeJS library calls the callback we passed to insertOne with two arguments. The first is an error object; if it is not null or undefined, then an error occurred. The second is a results object for the insertOne operation, which tells us information about the operation.
In the successful case, we pass the new document’s _id to the callback passed to insertExample. The results object for the insertOne operation contains the new document’s _id as the insertedId property.
The above code may be surprising to those of you who have worked with databases before. Most databases require you to tell the database what data looks like before it lets you store it. Unlike other databases, MongoDB does not enforce what each document in a collection looks like. This is both a strength, and a weakness:
- Strength: It is easy to store different objects of the same type (isA relations).
  - For example, if our Facebook supported multiple types of FeedItems, we could store all of them in the same object collection! In a normal database, we would typically need to store the different types in different tables/collections.
- Weakness: It is easy to store objects into the wrong collection.
  - If you typo the collection name: MongoDB will happily insert a document into a new collection called ‘usrs’ instead of ‘users’.
  - If you forget a collection name and use the wrong one: MongoDB will happily look for an object in a new collection called ‘user’ instead of ‘users’.
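If you want to defend against this weakness, one hypothetical approach (not part of the workshop code, just a defensive idea) is to funnel all collection lookups through a helper that checks the name against a whitelist:

```javascript
// Hypothetical guard against typo'd collection names. The helper
// name and the whitelist are our own invention for illustration.
var KNOWN_COLLECTIONS = ['users', 'feedItems', 'feeds'];

function getCollection(db, name) {
  if (KNOWN_COLLECTIONS.indexOf(name) === -1) {
    // Fail loudly instead of silently creating a new collection.
    throw new Error("Unknown collection: " + name);
  }
  // db.collection(name) is the real MongoDB driver call.
  return db.collection(name);
}
```

With this in place, getCollection(db, 'usrs') throws immediately instead of quietly writing into a brand-new 'usrs' collection.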
Next, let’s write a function that retrieves an object from the helloworld document collection with a particular ID. We can use the findOne function to query the database for a single object with that ID:
/**
* Get a document from the helloworld document collection with
* a particular _id.
* @param db The database connection.
* @param id The _id of the object to retrieve.
* @param callback A callback function to run when the operation completes.
* It is called with the requested object.
*/
function getHelloWorldDocument(db, id, callback) {
// Our database query: Find an object with this _id.
var query = {
"_id": id
};
// findOne returns the first object that matches the query.
// Since _id must be unique, there will only be one object that
// matches.
db.collection('helloworld').findOne(query, function(err, doc) {
if (err) {
// Something bad happened.
throw err;
} else {
// Success! If we found the document, then doc contains
// the document. If we did not find the document, doc is
// null.
callback(doc);
}
});
}
Now that we have written these functions, let’s write another function that calls them. Since these functions are asynchronous, we need to nest each step inside the callback of the previous one. This is how you should always deal with asynchronous functions in your own code – the next step should always be kicked off in the callback function of the previous step!
/**
* Add a new document to helloworld collection, read the document,
* print the document.
*/
function mongoExample(db) {
// Step 1: Insert the document.
insertExample(db, function(newId) {
// Step 2: Read the document.
getHelloWorldDocument(db, newId, function(doc) {
// Step 3: Print the document.
console.log("Wrote new object to helloworld collection:");
console.log(doc);
});
});
}
Finally, change the callback that we pass to MongoClient.connect to call mongoExample with the db connection.
Run the program in node (cd server followed by node src/hellodatabase.js). You should see output like this, except with a different ID:
$ node src/hellodatabase.js
Connected correctly to server.
Successfully updated database! The new object's ID is 56feb96697a178291ee6e9dc
Wrote new object to helloworld collection:
{ _id: 56feb96697a178291ee6e9dc, message: 'Hello, world!' }
Node will keep running, though! That is because NodeJS is still connected to the database. You will have to kill it with CTRL+C.
add server/src/hellodatabase.js to the Git repository, commit it with message mongoExample, and push it to GitHub.
Facebook Changes
Since the last workshop, we have added the ability to attach images to status updates.
To check this out, open a terminal to the client folder, and run npm run watch. Then, open another terminal to the server folder, and run node src/server.js. You should now have three terminals open:
- One to run the database
- A second to re-build the client when you change it
- A third to run the server
Visit http://localhost:3000/
In the status update entry widget, click on the camera icon below the text area, and it will bring up a file selection dialog. Select an image, and you will see it in the status update entry area. Type in some text for the status update, click post, and you will see a status update with an image in the feed!
To do this, we had to make the following changes in the client:
- StatusUpdateEntry: Contains the crazy logic needed to use a button for file uploads. Also contains code to read in the selected file as a data URI, and display it back to the user. The image is supplied as the second argument to the onPost prop.
- Feed: Adds the image argument to the onPost function passed to StatusUpdateEntry, and passes the image to the postStatusUpdate method in server.
- server.js: Adds image as another property on StatusUpdate entities POSTed to the server.
- FeedItem: Passes the image prop to StatusUpdate.
- StatusUpdate: Conditionally displays an image, if the status update has one.
On the server, we had to change:
- schemas/statusupdate.json: Added a non-required field called image, which needs to be a string (a data URI).
- server.js: Changed postStatusUpdate to accept image as an argument, and changed the POST /feeditems route to pass body.image to postStatusUpdate.
There are comments in the code with further details.
Note: If you try to upload larger images with the mock database, it’ll likely be quite slow. When we switch over to MongoDB, it should be fine with images that are less than 16 megabytes large.
Populate the Database with Mock Objects
Now that we have experimented a bit with MongoDB, it is time to move our Facebook mock objects a third time: From the mock database and into the real database!
Initial Import Script
Open up server/database.js, and copy the var initialData = { .... } statement. Then, open up src/resetdatabase.js, and replace the var initialData = null statement with the data you copied.
Convert Mock Database IDs to MongoDB ObjectIDs
MongoDB uses ObjectIDs as document IDs, which the client will see as string values. Our mock database uses numbers. We need to change those document IDs into ObjectIDs, and then change the client so that it refers to the string value of those ObjectIDs.
An ObjectID can be constructed by passing a 24-character hex string to the ObjectID constructor, like so:
var ObjectID = require('mongodb').ObjectID;
new ObjectID("000000000000000000000001")
Note: 24 hex characters equals 12 bytes of data.
Unfortunately, there is no simpler way to get the ObjectID for the number “4”; the string must be 24 characters long. ObjectIDs are unwieldy for hardcoded data, but you do not often need to write them out like this.
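If writing the zeroes out by hand gets tedious, a small hypothetical helper (not part of the workshop code) can build the padded hex string for you:

```javascript
// Hypothetical helper that turns a small mock database ID into the
// 24-character hex string that the ObjectID constructor expects.
function idToHexString(id) {
  var hex = id.toString(16);
  // Left-pad with zeroes until the string is 24 characters long.
  while (hex.length < 24) {
    hex = "0" + hex;
  }
  return hex;
}

console.log(idToHexString(4)); // "000000000000000000000004"
// You could then construct the ObjectID with:
// new ObjectID(idToHexString(4))
```

Whether you use a helper like this or write the strings out by hand, make sure every ID ends up exactly 24 characters long.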
Now, you need to change all database IDs into ObjectIDs. If you do not, then some equality checks in our server will fail! (e.g. user._id === new ObjectID(req.params.userid))
You need to convert:
- All of the _id properties of objects in the database
- All references to other objects in the database
So, the ID 4 will become new ObjectID("000000000000000000000004"). If you forget a zero on one of these ObjectIDs, then the resetdatabase.js script will throw an exception when you run it.
Here are some examples:
// Mock database user entity
{
"_id": 1,
"fullName": "Someone",
"feed": 1
}
// MongoDB database user entity
{
"_id": new ObjectID("000000000000000000000001"),
"fullName": "Someone",
"feed": new ObjectID("000000000000000000000001")
}
// Mock database feeditem
{
"_id": 1,
"likeCounter": [
2, 3
],
"type": "statusUpdate",
"contents": {
"author": 1,
"postDate": 1453668480000,
"location": "Austin, TX",
"contents": "ugh."
},
"comments": [
{
"author": 2,
"contents": "hope everything is ok!",
"postDate": 1453690800000,
"likeCounter": []
},
{
"author": 3,
"contents": "sending hugs your way",
"postDate": 1453690800000,
"likeCounter": []
}
]
}
// MongoDB database feeditem
{
"_id": new ObjectID("000000000000000000000001"),
"likeCounter": [
new ObjectID("000000000000000000000002"), new ObjectID("000000000000000000000003")
],
"type": "statusUpdate",
"contents": {
"author": new ObjectID("000000000000000000000001"),
"postDate": 1453668480000,
"location": "Austin, TX",
"contents": "ugh."
},
"comments": [
{
"author": new ObjectID("000000000000000000000002"),
"contents": "hope everything is ok!",
"postDate": 1453690800000,
"likeCounter": []
},
{
"author": new ObjectID("000000000000000000000003"),
"contents": "sending hugs your way",
"postDate": 1453690800000,
"likeCounter": []
}
]
}
// Mock database feed
{
"_id": 4,
"contents": [2, 1]
}
// MongoDB feed
{
"_id": new ObjectID("000000000000000000000004"),
"contents": [new ObjectID("000000000000000000000002"), new ObjectID("000000000000000000000001")]
}
Once you’ve converted all of the mock objects, double check that you did not forget any document references. If you miss one, you may have weird issues later as you interact with mock objects.
Then, run the resetdatabase script to populate the database with the mock objects:
$ node src/resetdatabase.js
Later on in the workshop, we will have to change a number of other things to adjust to the new ObjectIDs. These items include the JSON token in the client, anything that uses the number 4 for the user ID, and numerical comparisons in server routes (e.g. checking if a user has access to a feeditem).
Checking Imported Documents
Now that we have imported some documents, let’s check that the import succeeded!
Open up server/src/server.js, and add an import statement for the mongo-express Express middleware. We will use mongo-express to add an HTTP route to our server that lets you examine the database’s contents in a web browser:
var mongo_express = require('mongo-express/lib/middleware');
// Import the default Mongo Express configuration
var mongo_express_config = require('mongo-express/config.default.js');
Next, add the following code to configure the server to expose mongo-express at the URL /mongo_express:
app.use('/mongo_express', mongo_express(mongo_express_config));
Restart the server. Mongo Express will complain that “authentication failed” and that it could not connect to the database, but ignore it – it’s a spurious warning, as Mongo Express will try again and report “Database db connected”.
Next, head to the URL http://localhost:3000/mongo_express and log in with the username “admin” and password “pass”. You will see two databases: “facebook”, and “local” (MongoDB stores some internal stuff into “local”).
Click on “facebook”. If resetdatabase.js worked, you will see document collections for feedItems, feeds, and users. You will also see helloworld from the previous exercise.
Take a screenshot of this screen, and save it as server/mongo_express.png.
If you click on a document collection, Mongo Express will show you the documents in the collection. Neat!
add server/mongo_express.png, commit with message database-reset, and push the commit to GitHub.
Change Current User ID to a String
Before we can update the server to interact with MongoDB, we need to change all of the hardcoded data in the client to use ObjectIDs. Right now, the only hardcoded piece of data in the client is the current user’s ID.
Remember that ObjectIDs are sent to the client as strings, so we only need to change references to the user ID 4 to be the string "000000000000000000000004".
Change 4 to "000000000000000000000004" in the following places:
- app/app.js: The render function of FeedPage
- app/server.js: The getFeedData function; you need to change the route.
- app/components/comment.js: The didUserLike function of Comment
- app/components/feed.js: The onPost function of Feed
- app/components/feeditem.js: In FeedItem:
  - handleCommentPost
  - handleLikeClick (two times)
  - didUserLike
  - onCommentLike (two times)
- app/components/statusupdate.js: The render function of StatusUpdate (two times)
Next, let’s update the JSON web token! Open up a new terminal session, type node to enter the Node REPL like before, and run the following command to generate a new JSON token:
new Buffer(JSON.stringify({ id: "000000000000000000000004" })).toString('base64');
Update the token variable at the top of client/app/server.js to use the new token you have generated.
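If you want to sanity-check the token before pasting it in, you can decode it back in the Node REPL; base64-decoding the string and parsing the JSON should recover the ID:

```javascript
// Generate the token, as above.
var token = new Buffer(JSON.stringify({ id: "000000000000000000000004" }))
  .toString('base64');

// Decoding the base64 string and parsing the JSON recovers the ID.
var decoded = JSON.parse(new Buffer(token, 'base64').toString());
console.log(decoded.id); // "000000000000000000000004"
```

This round trip works because our unencrypted "web token" is nothing more than base64-encoded JSON.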
Now, Facebook should be entirely broken: The server will refuse the client’s GET request to get the feed data, since it is still coded to expect a numeric user ID in the JSON token. We’ll have to make some changes to the server…
commit these changes with message stringified, and push the changes to GitHub.
First Route: GET /user/:userid/feed
Now that the client is using proper MongoDB document references, we need to start changing the server to use MongoDB. Let’s start with the most important server route: GET /user/:userid/feed, which returns a user’s feed.
Connect to the Database
All of our Express routes are going to need to use the database to read, write, or update data. But in order to use the database, we need to connect to the database first, which requires an asynchronous function call.
Add the following code to server/src/server.js, and nest all of the server’s helper functions and HTTP routes into the callback function we supply to connect:
var MongoDB = require('mongodb');
var MongoClient = MongoDB.MongoClient;
var ObjectID = MongoDB.ObjectID;
var url = 'mongodb://localhost:27017/facebook';
MongoClient.connect(url, function(err, db) {
// Put everything that uses `app` into this callback function.
// from app.use(bodyParser.text());
// all the way to
// app.listen(3000, ...
// Also put all of the helper functions that use mock database
// methods like readDocument, writeDocument, ...
});
// The file ends here. Nothing should be after this.
We put all of the code involving app into the callback of MongoClient.connect to make sure that our server doesn’t start until it connects to the database, and to give all of the HTTP routes access to the db object passed to the callback. This object represents our connection to the database.
Redefine getFeedItemSync
Rename getFeedItemSync to just getFeedItem, since we cannot get data from the database synchronously anymore. ESLint will complain about this change, because getFeedItemSync is called elsewhere, but ignore this complaint – we will fix those callsites later!
Next, let’s look at the operations getFeedItem needs to perform, and figure out how to map them to MongoDB functionality:
- Retrieve a feed item with the given ID. Like in hellodatabase.js, we can use findOne to find the feed item with the given ID.
- Resolve the user object for the author of the feed item. Similarly, we could use findOne.
- Resolve the like counter to user objects. We could use findOne on each user ID in the like counter, or we could use the logical OR query operator $or to get them all in one database request!
- Resolve authors on comments to user objects. Same answer as the like counter.
$or is like || in an if statement in a programming language: It unifies multiple checks, and matches documents that satisfy at least one of the checks.
Instead of:
if (userObject._id === new ObjectID("000000000000000000000002") ||
userObject._id === new ObjectID("000000000000000000000004")) {
// userObject is a user with ID "000000000000000000000002",
// or "000000000000000000000004"!
}
…you write a query like so:
{
$or: [
{ "_id": new ObjectID("000000000000000000000002") },
{ "_id": new ObjectID("000000000000000000000004") }
]
}
MongoDB also supports logical AND, less than, and greater than.
You can use $or queries with find operations to locate particular documents, and with update operations to perform an update only on documents that match the query.
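Since query documents are plain JavaScript objects, you can sketch these other operators the same way. The field names below are just examples drawn from our feed item documents; the operators ($lt, $gt, $and) are MongoDB's:

```javascript
// Less than: feed items posted before a given date.
var beforeQuery = { "contents.postDate": { $lt: 1453668480000 } };

// Greater than: feed items posted after a given date.
var afterQuery = { "contents.postDate": { $gt: 1453668480000 } };

// Logical AND: documents must satisfy every clause in the array.
var andQuery = {
  $and: [
    { "type": "statusUpdate" },
    { "contents.location": "Austin, TX" }
  ]
};
```

Note the dot notation ("contents.postDate"), which lets a query reach into nested sub-documents.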
Since three of the four things we need to do involve looking up user objects, it makes sense to batch-resolve all of the user objects in one database query. In other words, we can make a list of all of the user objects we need to resolve for a feed item, resolve them in one query, and then update the FeedItem object with the resolved objects.
Let’s define a helper function called resolveUserObjects which, given a list of user IDs and a callback, returns an object containing all of the corresponding user objects through the callback:
/**
* Resolves a list of user objects. Returns an object that maps user IDs to
* user objects.
*/
function resolveUserObjects(userList, callback) {
// Special case: userList is empty.
// It would be invalid to query the database with a logical OR
// query with an empty array.
if (userList.length === 0) {
callback(null, {});
} else {
// Build up a MongoDB "OR" query to resolve all of the user objects
// in the userList.
var query = {
$or: userList.map((id) => { return {_id: id } })
};
// Resolve 'like' counter
db.collection('users').find(query).toArray(function(err, users) {
if (err) {
return callback(err);
}
      // Build a map from ID to user object. Object keys are always
      // strings, so each ObjectID key is stored as its 24-character
      // hex string.
var userMap = {};
users.forEach((user) => {
userMap[user._id] = user;
});
callback(null, userMap);
});
}
}
You may notice that we do not pass a callback into the find
function; instead,
we call toArray
, and pass a callback into that function.
Since find
is an operation that returns multiple documents,
it operates slightly differently than the MongoDB functions we have used so far.
Instead of accepting a single callback that gets all of the results of the
operation, it synchronously returns a cursor object
that you can perform further asynchronous actions on, such as further filtering
the results or performing an action on each item that the database returned.
Above, we call toArray()
on the cursor, which accepts a callback that receives all of the matched documents.
It is likely that this is the only cursor function you will need to use in this
course.
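To see why this design matters, here is a toy stand-in for find (not the real driver API): it returns a cursor-like object synchronously, and the results only materialize when you call toArray with a callback:

```javascript
// A toy stand-in for the driver's find(): it returns a cursor-like
// object synchronously; the matched documents are only delivered when
// you call toArray() with a callback. (Sketch only -- not the real
// MongoDB API, which also supports filtering and streaming on the cursor.)
function fakeFind(documents, query) {
  var matched = documents.filter(function(doc) {
    return Object.keys(query).every(function(key) {
      return doc[key] === query[key];
    });
  });
  return {
    toArray: function(callback) {
      // The real driver delivers results asynchronously; setImmediate
      // preserves that "callback runs later" behavior.
      setImmediate(function() { callback(null, matched); });
    }
  };
}

// Usage mirrors the real driver:
var cursor = fakeFind([{ type: "statusUpdate" }, { type: "comment" }],
                      { type: "statusUpdate" });
cursor.toArray(function(err, docs) {
  console.log(docs.length); // number of matched documents
});
```

Notice that fakeFind itself returns immediately; only toArray involves a callback, just like the real cursor.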
Now, we can change getFeedItem
so that it collects a list of user IDs
it needs to resolve into a list, calls resolveUserObjects
with that list,
and then replaces all of the user IDs with the resolved user objects:
/**
* Resolves a feed item asynchronously. Internal to the server.
* @param feedItemId The feed item's ID. Must be an ObjectID.
* @param callback Called when the operation finishes. First argument is an error object,
* which is null if the operation succeeds, and the second argument is the
* resolved feed item.
*/
function getFeedItem(feedItemId, callback) {
// Get the feed item with the given ID.
db.collection('feedItems').findOne({
_id: feedItemId
}, function(err, feedItem) {
if (err) {
// An error occurred.
return callback(err);
} else if (feedItem === null) {
// Feed item not found!
return callback(null, null);
}
// Build a list of all of the user objects we need to resolve.
// Start off with the author of the feedItem.
var userList = [feedItem.contents.author];
// Add all of the user IDs in the likeCounter.
userList = userList.concat(feedItem.likeCounter);
// Add all of the authors of the comments.
feedItem.comments.forEach((comment) => userList.push(comment.author));
// Resolve all of the user objects!
resolveUserObjects(userList, function(err, userMap) {
if (err) {
return callback(err);
}
// Use the userMap to look up the author's user object
feedItem.contents.author = userMap[feedItem.contents.author];
// Look up the user objects for all users in the like counter.
feedItem.likeCounter = feedItem.likeCounter.map((userId) => userMap[userId]);
// Look up each comment's author's user object.
feedItem.comments.forEach((comment) => {
comment.author = userMap[comment.author];
});
// Return the resolved feedItem!
callback(null, feedItem);
});
});
}
Reminder: Ignore ESLint complaints about getFeedItemSync
not being defined.
We will change other references to getFeedItemSync
later.
While the above function looks different from getFeedItemSync
, notice how
it performs all of the same tasks. This is the theme of this workshop:
redefining all of your database interactions so they do the same thing,
but asynchronously using MongoDB.
Also, notice how the callback
passed to these functions takes two arguments.
The first is an error, if an error occurs, and the second is the result
of the operation. MongoDB callbacks use the same scheme; the first argument
to the callback is always an error object, which is null
if no error
occurred, and the second argument is the result of the operation.
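As a standalone illustration of the error-first convention, consider this hypothetical function (not part of the workshop code):

```javascript
// Error-first callbacks: the first argument is an Error (or null),
// the second is the result. Hypothetical example function.
function divide(a, b, callback) {
  if (b === 0) {
    // Operation failed: pass an Error as the first argument.
    callback(new Error("Division by zero"));
  } else {
    // Operation succeeded: null error, then the result.
    callback(null, a / b);
  }
}

divide(10, 2, function(err, result) {
  if (err) {
    // Handle the error; never touch `result` in this branch.
    console.error(err.message);
  } else {
    console.log(result); // 5
  }
});
```

Every MongoDB driver callback in this workshop follows this exact shape, which is why each one begins with an `if (err)` check.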
Redefining getFeedData
Now, we can redefine getFeedData
so it calls getFeedItem
for each
item in the feed. Below, we asynchronously iterate over each document
reference in the user’s feed:
/**
* Get the feed data for a particular user.
* @param user The ObjectID of the user document.
*/
function getFeedData(user, callback) {
db.collection('users').findOne({
_id: user
}, function(err, userData) {
if (err) {
return callback(err);
} else if (userData === null) {
// User not found.
return callback(null, null);
}
db.collection('feeds').findOne({
_id: userData.feed
}, function(err, feedData) {
if (err) {
return callback(err);
} else if (feedData === null) {
// Feed not found.
return callback(null, null);
}
// We will place all of the resolved FeedItems here.
// When done, we will put them into the Feed object
// and send the Feed to the client.
var resolvedContents = [];
// processNextFeedItem is like an asynchronous for loop:
// It performs processing on one feed item, and then triggers
// processing the next item once the first one completes.
// When all of the feed items are processed, it completes
// a final action: Sending the response to the client.
function processNextFeedItem(i) {
// Asynchronously resolve a feed item.
getFeedItem(feedData.contents[i], function(err, feedItem) {
if (err) {
// Pass an error to the callback.
callback(err);
} else {
// Success!
resolvedContents.push(feedItem);
if (resolvedContents.length === feedData.contents.length) {
// I am the final feed item; all others are resolved.
// Pass the resolved feed document back to the callback.
feedData.contents = resolvedContents;
callback(null, feedData);
} else {
// Process the next feed item.
processNextFeedItem(i + 1);
}
}
});
}
// Special case: Feed is empty.
if (feedData.contents.length === 0) {
callback(null, feedData);
} else {
processNextFeedItem(0);
}
});
});
}
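The processNextFeedItem pattern generalizes to any "asynchronous for loop." Here is a self-contained sketch of the same technique; the helper name asyncEach is hypothetical, not part of the workshop code:

```javascript
// Applies an asynchronous `iterator` to each item in sequence, then
// calls `done` with the collected results. This is the same pattern
// used by processNextFeedItem in getFeedData.
function asyncEach(items, iterator, done) {
  var results = [];
  function processNext(i) {
    if (i === items.length) {
      // Base case: every item has been processed (also handles
      // the empty-list special case).
      return done(null, results);
    }
    iterator(items[i], function(err, result) {
      if (err) {
        return done(err);
      }
      results.push(result);
      // Process the next item only after the previous one completes.
      processNext(i + 1);
    });
  }
  processNext(0);
}
```

With a helper like this, getFeedData could resolve its feed items via something like asyncEach(feedData.contents, getFeedItem, ...), preserving the order of the feed.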
Update GET /user/:userid/feed
Now, let’s look at the route for GET /user/:userid/feed
.
First, we need to change some code that expects numeric IDs instead of strings.
The route for GET /user/:userid/feed
expects a numeric user ID
in the JSON web token in these lines:
var userid = req.params.userid;
var fromUser = getUserIdFromToken(req.get('Authorization'));
// userid is a string. We need it to be a number.
var useridNumber = parseInt(userid, 10);
if (fromUser === useridNumber) {
Simplify these lines to the following, which cuts out the integer parsing code:
var userid = req.params.userid;
var fromUser = getUserIdFromToken(req.get('Authorization'));
if (fromUser === userid) {
Next, look at getUserIdFromToken
, which extracts a user ID from a
JSON web token. This function expects a numeric
user ID in these lines:
// Check that id is a number.
if (typeof id === 'number') {
return id;
} else {
// Not a number. Return -1, an invalid ID.
return -1;
}
Change these lines to expect a string, and to return an empty string when invalid:
// Check that id is a string.
if (typeof id === 'string') {
return id;
} else {
// Not a string. Return "", an invalid ID.
return "";
}
Now, we can return to the route for GET /user/:userid/feed
,
and use getFeedData
in the route to get the data it needs:
/**
* Get the feed data for a particular user.
*/
app.get('/user/:userid/feed', function(req, res) {
var userid = req.params.userid;
var fromUser = getUserIdFromToken(req.get('Authorization'));
if (fromUser === userid) {
// Convert userid into an ObjectID before passing it to database queries.
getFeedData(new ObjectID(userid), function(err, feedData) {
if (err) {
// A database error happened.
// Internal Error: 500.
res.status(500).send("Database error: " + err);
} else if (feedData === null) {
// Couldn't find the feed in the database.
res.status(400).send("Could not look up feed for user " + userid);
} else {
// Send data.
res.send(feedData);
}
});
} else {
// 403: Forbidden; the requester is not authorized to view this feed.
res.status(403).end();
}
});
Now, if you restart the server and head to http://localhost:3000/
,
you should see the Facebook feed like normal! Note that other interactions
with the page will not work properly, since we are only using the real
database on one HTTP route. In addition, Facebook will hide the “delete” and
“edit” menu items from status updates you authored.
We will address these issues in future steps.
commit these changes with message feed-database, and push the changes to GitHub.
Change Other Routes to Use MongoDB
Now that we have completely converted one HTTP route to use MongoDB, we can update the others! We will tackle all of the HTTP routes in this section except for comment-based routes; you will need to make those changes yourself in the final step of the workshop.
POST /feeditem
POST /feeditem
is the HTTP route that posts a new status update to Facebook.
The core part of this route is implemented in the postStatusUpdate
helper function, so we will focus on that helper function first.
postStatusUpdate
performs the following operations:
- Adds the status update to the feedItems document collection.
  - We can use insertOne to do this.
- Gets the author’s user object.
  - We can use findOne to do this.
- Updates the author’s feed to include the status update.
  - We can use updateOne to do this, using a $push update operator.
- Returns the new status update.
Let’s talk a bit more about the $push
update operator. Like JavaScript’s
push
on arrays, $push
adds elements to the end of an array. In the previous code of postStatusUpdate, we used unshift on the JavaScript array to add elements to the front of the array instead.
MongoDB does not have an $unshift
operator yet,
but you can emulate it by performing a MongoDB $push
at the start of the array
using the $position
operator:
$push: {
contents: {
$each: [ newStatusUpdateID ],
$position: 0
}
}
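You can convince yourself that a $push at $position: 0 behaves like unshift using plain JavaScript arrays. This is only a local sketch; the real operation happens inside MongoDB on the server:

```javascript
// Approximates MongoDB's $push with $each/$position on a local array:
// insert `items` into `arr` starting at index `position`.
function pushAtPosition(arr, items, position) {
  Array.prototype.splice.apply(arr, [position, 0].concat(items));
}

var a = ["b", "c"];
pushAtPosition(a, ["a"], 0); // a is now ["a", "b", "c"]

var b = ["b", "c"];
b.unshift("a");              // same result: ["a", "b", "c"]
```

Both calls leave the new element at the front of the array, which is exactly what the feed needs: the newest status update first.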
We will have to perform each of these operations asynchronously, nested inside the callback of the previous step.
/**
* Adds a new status update to the database.
* @param user ObjectID of the user.
*/
function postStatusUpdate(user, location, contents, image, callback) {
// Get the current UNIX time.
var time = new Date().getTime();
// The new status update. The database will assign the ID for us.
var newStatusUpdate = {
"likeCounter": [],
"type": "statusUpdate",
"contents": {
"author": user,
"postDate": time,
"location": location,
"contents": contents,
"image": image
},
// List of comments on the post
"comments": []
};
// Add the status update to the database.
db.collection('feedItems').insertOne(newStatusUpdate, function(err, result) {
if (err) {
return callback(err);
}
// Unlike the mock database, MongoDB does not return the newly added object
// with the _id set.
// Attach the new feed item's ID to the newStatusUpdate object. We will
// return this object to the client when we are done.
// (When performing an insert operation, result.insertedId contains the new
// document's ID.)
newStatusUpdate._id = result.insertedId;
// Retrieve the author's user object.
db.collection('users').findOne({ _id: user }, function(err, userObject) {
if (err) {
return callback(err);
}
// Update the author's feed with the new status update's ID.
db.collection('feeds').updateOne({ _id: userObject.feed },
{
$push: {
contents: {
$each: [newStatusUpdate._id],
$position: 0
}
}
},
function(err) {
if (err) {
return callback(err);
}
// Return the new status update to the application.
callback(null, newStatusUpdate);
}
);
});
});
}
Now, you can change the POST /feeditem
route to use the new helper function.
Notice how the logic is essentially the same, except we moved some logic into
the callback argument of postStatusUpdate
:
//`POST /feeditem { userId: user, location: location, contents: contents }`
app.post('/feeditem', validate({ body: StatusUpdateSchema }), function(req, res) {
// If this function runs, `req.body` passed JSON validation!
var body = req.body;
var fromUser = getUserIdFromToken(req.get('Authorization'));
// Check if requester is authorized to post this status update.
// (The requester must be the author of the update.)
if (fromUser === body.userId) {
postStatusUpdate(new ObjectID(fromUser), body.location, body.contents, body.image, function(err, newUpdate) {
if (err) {
// A database error happened.
// 500: Internal error.
res.status(500).send("A database error occurred: " + err);
} else {
// When POST creates a new resource, we should tell the client about it
// in the 'Location' header and use status code 201.
res.status(201);
res.set('Location', '/feeditem/' + newUpdate._id);
// Send the update!
res.send(newUpdate);
}
});
} else {
// 401: Unauthorized.
res.status(401).end();
}
});
Finally, we need to update the JSON schema for status updates to accept
status updates with a string user ID instead of a number user ID.
Update server/src/schemas/statusupdate.json
so that the type
of
"userId"
is a "string"
.
Restart the server. If you did everything correctly, you should be able to post status updates!
PUT /feeditem/:feeditemid/likelist/:userid
This route is responsible for “liking” a feed item. It performs the following database operations:
- Gets the feed item object.
  - We can use findOne to do this.
- Updates the feed item’s like counter to contain the specified user ID.
  - We can use updateOne to do this, along with the $addToSet operator.
- Retrieves the user object for every user ID in the like counter, and returns it to the client.
  - We can re-use the resolveUserObjects helper function we defined earlier.
$addToSet
checks if a given item is already in an array. If it is, it does nothing.
If the item is missing from the array, it adds the item to the array. It is the perfect
operator to update the like counter!
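The semantics of $addToSet are easy to model on a plain JavaScript array. This is only a local sketch: MongoDB applies the operation atomically on the server, and it compares ObjectIDs by value, whereas this sketch uses strict equality:

```javascript
// Approximates $addToSet on a local array: add `item` only if it is
// not already present. (Sketch only; MongoDB does this server-side.)
function addToSet(arr, item) {
  if (arr.indexOf(item) === -1) {
    arr.push(item);
  }
}

var likeCounter = ["user1"];
addToSet(likeCounter, "user2"); // likeCounter is now ["user1", "user2"]
addToSet(likeCounter, "user2"); // unchanged; adding a duplicate does nothing
```

This "add if absent" behavior is why $addToSet makes "liking" idempotent: liking an already-liked status update changes nothing.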
This route also currently uses parseInt
on the feeditemid
and userid
.
We should remove that logic, as document IDs are now strings.
It actually makes sense to update the like counter first, before we get the feed item object. That way, we get the feed item with the updated like counter.
The final route looks like this:
/**
* Helper function: Sends back HTTP response with error code 500 due to
* a database error.
*/
function sendDatabaseError(res, err) {
res.status(500).send("A database error occurred: " + err);
}
// `PUT /feeditem/feedItemId/likelist/userId`
app.put('/feeditem/:feeditemid/likelist/:userid', function(req, res) {
var fromUser = getUserIdFromToken(req.get('Authorization'));
var feedItemId = new ObjectID(req.params.feeditemid);
var userId = req.params.userid;
if (fromUser === userId) {
// First, we can update the like counter.
db.collection('feedItems').updateOne({ _id: feedItemId },
{
// Add `userId` to the likeCounter if it is not already
// in the array.
$addToSet: {
likeCounter: new ObjectID(userId)
}
}, function(err) {
if (err) {
return sendDatabaseError(res, err);
}
// Second, grab the feed item now that we have updated it.
db.collection('feedItems').findOne({ _id: feedItemId }, function(err, feedItem) {
if (err) {
return sendDatabaseError(res, err);
}
// Resolve the user IDs in the likeCounter into user objects.
resolveUserObjects(feedItem.likeCounter, function(err, userMap) {
if (err) {
return sendDatabaseError(res, err);
}
// Return a resolved version of the likeCounter
res.send(feedItem.likeCounter.map((userId) => userMap[userId]));
});
}
);
});
} else {
// 401: Unauthorized.
res.status(401).end();
}
});
Restart the server. You should now be able to “Like” status updates, but not “Unlike” them!
DELETE /feeditem/:feeditemid/likelist/:userid
This route is responsible for “unliking” a feed item.
This route is similar to the previous route, except it removes a user
from a Feed Item’s likeCounter
.
This route performs the following database operations:
- Gets the feed item object.
  - We can use findOne to do this.
- Removes the specified user ID from the feed item’s likeCounter.
  - We can use updateOne to do this, along with the $pull operator.
- Retrieves the user object for every user ID in the like counter, and returns it to the client.
  - We can re-use the resolveUserObjects helper function we defined earlier.
The $pull
operator removes objects from an array that match a particular condition. Here, we want to
remove a particular ObjectID from the likeCounter
.
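Likewise, $pull can be modeled on a plain JavaScript array. Again, this is only a local sketch of the server-side behavior, and MongoDB compares ObjectIDs by value rather than by reference:

```javascript
// Approximates $pull on a local array: remove every element equal
// to `item`. (Sketch only; MongoDB does this server-side.)
function pull(arr, item) {
  var i = arr.indexOf(item);
  while (i !== -1) {
    arr.splice(i, 1);
    i = arr.indexOf(item);
  }
}

var counter = ["user1", "user2", "user2"];
pull(counter, "user2"); // counter is now ["user1"]
```

Like $addToSet, $pull is idempotent: "unliking" an update you never liked leaves the likeCounter unchanged.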
As in the previous HTTP route, we should update the likeCounter before reading the FeedItem from the database, which gets us the FeedItem with the updated likeCounter.
We also need to remove parseInt
logic from this function, as the userId and feedItemId
should be left as strings.
The final route looks like this:
// Unlike a feed item.
app.delete('/feeditem/:feeditemid/likelist/:userid', function(req, res) {
var fromUser = getUserIdFromToken(req.get('Authorization'));
var feedItemId = new ObjectID(req.params.feeditemid);
var userId = req.params.userid;
if (fromUser === userId) {
// Step 1: Remove userId from the likeCounter.
db.collection('feedItems').updateOne({ _id: feedItemId },
{
// Only removes the userId from the likeCounter, if it is in the likeCounter.
$pull: {
likeCounter: new ObjectID(userId)
}
}, function(err) {
if (err) {
return sendDatabaseError(res, err);
}
// Step 2: Get the feed item.
db.collection('feedItems').findOne({ _id: feedItemId }, function(err, feedItem) {
if (err) {
return sendDatabaseError(res, err);
}
// Step 3: Resolve the user IDs in the like counter into user objects.
resolveUserObjects(feedItem.likeCounter, function(err, userMap) {
if (err) {
return sendDatabaseError(res, err);
}
// Return a resolved version of the likeCounter
res.send(feedItem.likeCounter.map((userId) => userMap[userId]));
});
});
});
} else {
// 401: Unauthorized.
res.status(401).end();
}
});
Restart the server. You should now be able to “Unlike” status updates!
PUT /feeditem/:feeditemid/content
This route is responsible for editing a feed item.
This route performs the following database operations:
- Gets the feed item, and checks if its author matches the authenticated user.
  - We can specify this constraint in a filter for updateOne.
- Updates the feed item’s contents.
  - We can use updateOne to do this, along with the $set operator.
- Returns the feed item, with all references resolved to objects.
  - We can use getFeedItem to do this.
Since updateOne
accepts both a filter and an update, we can specify the filter such
that the update only applies to a feed item with the given feed item id and authored
by the authenticated user. This filter looks like this:
{
_id: feedItemId,
// This is how you specify fields on an embedded document.
// contents.author is the author field of the embedded status
// update document stored in contents
"contents.author": fromUser
}
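To build intuition for how a dotted path like "contents.author" reaches into an embedded document, here is a small illustrative resolver. MongoDB performs this matching server-side; the helper and the sample document below are hypothetical:

```javascript
// Resolves a dotted path like "contents.author" against a document.
// (Illustration only; MongoDB evaluates dot notation server-side.)
function getPath(doc, path) {
  return path.split('.').reduce(function(obj, key) {
    return (obj == null) ? undefined : obj[key];
  }, doc);
}

var feedItem = {
  _id: "000000000000000000000001",
  contents: { author: "000000000000000000000004" }
};

// "contents.author" reaches the author field of the embedded
// status update document stored in contents.
getPath(feedItem, "contents.author"); // "000000000000000000000004"
```

A filter with "contents.author" therefore matches only documents whose embedded author field equals the given value, which is exactly the permissions check this route needs.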
The final code looks like the following:
// `PUT /feeditem/feedItemId/content newContent`
app.put('/feeditem/:feeditemid/content', function(req, res) {
var fromUser = new ObjectID(getUserIdFromToken(req.get('Authorization')));
var feedItemId = new ObjectID(req.params.feeditemid);
// Only update the feed item if the author matches the currently authenticated
// user.
db.collection('feedItems').updateOne({
_id: feedItemId,
// This is how you specify nested fields on the document.
"contents.author": fromUser
}, { $set: { "contents.contents": req.body } }, function(err, result) {
if (err) {
return sendDatabaseError(res, err);
} else if (result.modifiedCount === 0) {
// Could not find the specified feed item. Perhaps it does not exist, or
// is not authored by the user.
// 400: Bad request.
return res.status(400).end();
}
// Update succeeded! Return the resolved feed item.
getFeedItem(feedItemId, function(err, feedItem) {
if (err) {
return sendDatabaseError(res, err);
}
res.send(feedItem);
});
});
});
Restart the server. You should now be able to edit status updates that you author!
DELETE /feeditem/:feeditemid
This route deletes a feed item. It performs the following database operations:
- Gets the feed item, and checks if the authenticated user is the author.
  - We can use findOne to perform this check.
- Deletes the feed item.
  - We can use deleteOne to do this.
- Removes all references to the feed item from all feeds in the database.
  - We can use updateMany along with $pull to remove the feed item from all feeds.
This order of operations is problematic, as deleting the feed item before removing the feed item from feeds will cause feeds to contain a dangling reference. In other words, some feeds will reference a feed item that no longer exists!
To fix this issue, we need to perform the permissions check first, followed by removing the feed item from all feeds, and finishing with deleting the feed item itself.
The final code for this route looks like the following:
// `DELETE /feeditem/:id`
app.delete('/feeditem/:feeditemid', function(req, res) {
var fromUser = new ObjectID(getUserIdFromToken(req.get('Authorization')));
var feedItemId = new ObjectID(req.params.feeditemid);
// Check if authenticated user has access to delete the feed item.
db.collection('feedItems').findOne({
_id: feedItemId,
"contents.author": fromUser
}, function(err, feedItem) {
if (err) {
return sendDatabaseError(res, err);
} else if (feedItem === null) {
// Could not find the specified feed item. Perhaps it does not exist, or
// is not authored by the user.
// 400: Bad request.
return res.status(400).end();
}
// User authored the feed item!
// Remove feed item from all feeds using $pull and a blank filter.
// A blank filter matches every document in the collection.
db.collection('feeds').updateMany({}, {
$pull: {
contents: feedItemId
}
}, function(err) {
if (err) {
return sendDatabaseError(res, err);
}
// Finally, remove the feed item.
db.collection('feedItems').deleteOne({
_id: feedItemId
}, function(err) {
if (err) {
return sendDatabaseError(res, err);
}
// Send a blank response to indicate success.
res.send();
});
});
});
});
Restart the server. You should now be able to delete feed items if they are authored by you.
POST /search
This route returns all of the status updates in the current user’s feed that contain the search text.
The route currently performs the following database operations:
- Gets the user.
  - Simple enough: we can use findOne.
- Gets the user’s feed.
  - We can use findOne for this operation, too.
- Finds feed items in the user’s feed that contain the search text.
  - We can use find with a complex query, using the $or operator to limit the search to feed items whose _id is in the user’s feed, and the $text operator to find feed items containing the specified query text.
MongoDB has a $text
query operator that performs text search, but it only works with “fields indexed with a text index”.
If we want to use it on status update text, we need to add a text index to the document
collection that indexes the field "contents.contents"
.
It is best to apply the index when we create the collections in the first place.
Open up server/src/resetdatabase.js
, and add the following function to the file:
/**
* Adds any desired indexes to the database.
*/
function addIndexes(db, cb) {
db.collection('feedItems').createIndex({ "contents.contents": "text" }, null, cb);
}
Then, in resetDatabase
, change the else
condition in processNextCollection
so it
calls addIndexes
with the callback instead of calling the callback directly:
else {
addIndexes(db, cb);
}
Now, run node server/src/resetdatabase.js
to reset the database.
With that out of the way, we can finally write the HTTP route for /search
!
//`POST /search queryText`
app.post('/search', function(req, res) {
var fromUser = new ObjectID(getUserIdFromToken(req.get('Authorization')));
if (typeof(req.body) === 'string') {
// trim() removes whitespace before and after the query.
// toLowerCase() makes the query lowercase.
var queryText = req.body.trim().toLowerCase();
// Get the user.
db.collection('users').findOne({ _id: fromUser}, function(err, userData) {
if (err) {
return sendDatabaseError(res, err);
} else if (userData === null) {
// User not found.
// 400: Bad request.
return res.status(400).end();
}
// Get the user's feed.
db.collection('feeds').findOne({ _id: userData.feed }, function(err, feedData) {
if (err) {
return sendDatabaseError(res, err);
}
// Look for feed items within the feed that contain queryText.
db.collection('feedItems').find({
$or: feedData.contents.map((id) => { return { _id: id }}),
$text: {
$search: queryText
}
}).toArray(function(err, items) {
if (err) {
return sendDatabaseError(res, err);
}
// Resolve all of the feed items.
var resolvedItems = [];
var errored = false;
function onResolve(err, feedItem) {
if (errored) {
return;
} else if (err) {
errored = true;
sendDatabaseError(res, err);
} else {
resolvedItems.push(feedItem);
if (resolvedItems.length === items.length) {
// Send resolved items to the client!
res.send(resolvedItems);
}
}
}
// Resolve all of the matched feed items in parallel.
for (var i = 0; i < items.length; i++) {
// Would be more efficient if we had a separate helper that
// resolved feed items from their objects and not their IDs.
// Not a big deal in our small applications, though.
getFeedItem(items[i]._id, onResolve);
}
// Special case: No results.
if (items.length === 0) {
res.send([]);
}
});
});
});
} else {
// 400: Bad Request.
res.status(400).end();
}
});
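The onResolve pattern above is the parallel counterpart of the sequential loop in getFeedData: fire off every operation at once, count completions, and report the first error only once. Here is the same technique as a self-contained sketch; the helper name asyncEachParallel is hypothetical:

```javascript
// Runs `iterator` on every item in parallel, collecting results and
// reporting only the first error. This is the same pattern as the
// onResolve function in the search route.
function asyncEachParallel(items, iterator, done) {
  var results = [];
  var errored = false;
  if (items.length === 0) {
    // Special case: nothing to do.
    return done(null, results);
  }
  items.forEach(function(item) {
    iterator(item, function(err, result) {
      if (errored) {
        return; // An earlier item already failed; stay silent.
      }
      if (err) {
        errored = true;
        return done(err);
      }
      results.push(result);
      if (results.length === items.length) {
        // All items have completed.
        done(null, results);
      }
    });
  });
}
```

Note that, as in the search route, results arrive in completion order rather than input order; the sequential asyncEach pattern from getFeedData is the right choice when order matters.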
Restart the server. You should now be able to search for feed items!
POST /resetdb
We can reset the database by running node src/resetdatabase.js
, but it would be nice
to have our friend the Reset DB button working again.
Fortunately, you can! Import the ResetDatabase
function from ./resetdatabase.js
like
so:
var ResetDatabase = require('./resetdatabase');
Then, change the /resetdb
route so it calls this function with the db
:
// Reset the database.
app.post('/resetdb', function(req, res) {
console.log("Resetting database...");
ResetDatabase(db, function() {
res.send();
});
});
Restart the server. You should now be able to click on the “Reset DB” button, and it should reset the feed to its initial state!
commit
your changes as databased
and push
them to GitHub.
Update Comment Routes to Use MongoDB
As you might have guessed, it’s now your turn to try your hand at updating HTTP routes to use the database. Specifically, you need to convert the following HTTP routes so that they use MongoDB:
POST /feeditem/:feeditemid/comments
PUT /feeditem/:feeditemid/comments/:commentindex/likelist/:userid
DELETE /feeditem/:feeditemid/comments/:commentindex/likelist/:userid
Here are some tips you should keep in mind:
- commentindex is still a number. It is not a database _id; it is literally the comment’s array index in the feed item’s comments array!
- User IDs and feed item IDs should no longer be parsed as numbers.
- Remember to convert user IDs and feed item IDs given to you by the client into ObjectIDs before querying the database!
- You will need to adjust the JSON schema for comments.
- Remember that you can visit http://localhost:3000/mongo_express to look at the data in the database!
- In MongoDB, you reference array elements using the index as a property name. The MongoDB documentation on updating arrays discusses this further, with some sample code.
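For example, you can build the dotted property name for a comment's like counter with string concatenation. The variable names below are hypothetical, and in the real routes the user ID would be converted to an ObjectID first:

```javascript
// In MongoDB, "comments.2.likeCounter" names the likeCounter array of
// the comment at index 2. Build the key with string concatenation:
var commentIndex = 2;
var userId = "000000000000000000000004";

var update = { $addToSet: {} };
update.$addToSet["comments." + commentIndex + ".likeCounter"] = userId;
// update is now:
// { $addToSet: { "comments.2.likeCounter": "000000000000000000000004" } }
```

An object like this can then be passed as the update argument to updateOne on the feedItems collection.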
commit
your changes as comments
, and push
them to GitHub.
Conclusion and Other Resources
Our Facebook application is now running across three separate programs:
- The web browser, which runs the client.
- Node.JS, which runs the server.
- MongoDB, which runs the database.
The only limitation in our application is that we have hardcoded the user. In the next workshop, we will explore lifting that limitation!
Further Reading
Just about all that you need to know about MongoDB can be found here:
- Documentation for MongoDB Node.JS driver
- Awesome getting-started resource with code snippets and links to further documentation.
Submission
You must submit the URL of your Workshop6 GitHub repository to Moodle. Visit Moodle, find the associated Workshop 9 activity, and provide your URL. Make sure your Workshop6 repository is public so we can clone your repository and evaluate your work. Submitting the URL for this assignment is part of completing the work.