Monday, June 17, 2013

Efficient SQL queries on MongoDB

How do you integrate MongoDB with other data in your organization? MongoDB is great for building applications, and it has its own powerful query API, but it's difficult to mash up data between MongoDB and other tools, or to make tools that speak SQL, such as Pentaho Analysis (Mondrian), connect to MongoDB.

Building a SQL interface isn't easy, because MongoDB's data model is such a long way from SQL's model. Here are some of the challenges:
  • MongoDB doesn't have a schema. Each database has a number of named 'collections', which are the nearest thing to a SQL table, but each row in a collection can have a completely different set of columns.
  • In MongoDB, data can be nested. Each row consists of a number of fields, and each field can be a scalar value, null, a record, or an array of records.
  • MongoDB supports a number of relational operations, but doesn't use the same terminology as SQL: the find method supports the equivalent of SELECT and WHERE, while the aggregate method supports the equivalent of SELECT, WHERE, GROUP BY, HAVING and ORDER BY.
  • For efficiency, it's really important to push as much of the processing as possible down to MongoDB's query engine, without the user having to rewrite their SQL.
  • But MongoDB doesn't support anything equivalent to JOIN.
  • MongoDB can't access external data.
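To make the find/aggregate correspondence concrete, here is a sketch of how the clauses of a SQL query line up with the stages of a MongoDB aggregate pipeline, written as plain Python data structures. The query, field names and thresholds are purely illustrative; the stage operators ($match, $group, $sort) are MongoDB's.

```python
# Rough correspondence between SQL clauses and MongoDB aggregate stages.
# The pipeline below approximates this (illustrative) SQL query:
#
#   SELECT state, SUM(pop) AS pop
#   FROM zips
#   WHERE pop > 1000
#   GROUP BY state
#   HAVING SUM(pop) > 500000
#   ORDER BY pop DESC
#
pipeline = [
    {"$match": {"pop": {"$gt": 1000}}},        # WHERE
    {"$group": {"_id": "$state",               # GROUP BY
                "pop": {"$sum": "$pop"}}},     # aggregate in SELECT
    {"$match": {"pop": {"$gt": 500000}}},      # HAVING
    {"$sort": {"pop": -1}},                    # ORDER BY
]
# This list would be passed to db.zips.aggregate(...) in the mongo shell
# or through a driver.
```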

I decided to tackle this using Optiq. Optiq already has a SQL parser and a powerful query optimizer that is powered by rewrite rules. Building on Optiq's core rules, I can add rules that map tables onto MongoDB collections, and relational operations onto MongoDB's find and aggregate operators.

What I produced is effectively a JDBC driver for MongoDB. Behind it is a hybrid query-processing engine that pushes as much of the query processing as possible down to MongoDB, and performs whatever is left (such as joins) in the client.

Let's give it a try. First, install MongoDB, and import MongoDB's zipcode data set:
$ curl -o /tmp/zips.json
$ mongoimport --db test --collection zips --file /tmp/zips.json
Tue Jun  4 16:24:14.190 check 9 29470
Tue Jun  4 16:24:14.469 imported 29470 objects
Log into MongoDB to check it's there:
$ mongo
MongoDB shell version: 2.4.3
connecting to: test
> db.zips.find().limit(3)
{ "city" : "ACMAR", "loc" : [ -86.51557, 33.584132 ], "pop" : 6055, "state" : "AL", "_id" : "35004" }
{ "city" : "ADAMSVILLE", "loc" : [ -86.959727, 33.588437 ], "pop" : 10616, "state" : "AL", "_id" : "35005" }
{ "city" : "ADGER", "loc" : [ -87.167455, 33.434277 ], "pop" : 3205, "state" : "AL", "_id" : "35006" }
> exit
Now let's see the same data via SQL. Download and install Optiq:
$ git clone
$ mvn install
Optiq comes with a sample model in JSON format, and the sqlline SQL shell. Connect using the mongo-zips-model.json Optiq model, and use sqlline's !tables command to list the available tables.
$ ./sqlline
sqlline> !connect jdbc:optiq:model=mongodb/target/test-classes/mongo-zips-model.json admin admin
Connecting to jdbc:optiq:model=mongodb/target/test-classes/mongo-zips-model.json
Connected to: Optiq (version 0.4.13)
Driver: Optiq JDBC Driver (version 0.4.13)
Autocommit status: true
sqlline> !tables
| null       | mongo_raw    | zips            | TABLE         |
| null       | mongo_raw    | system.indexes  | TABLE         |
| null       | mongo        | ZIPS            | VIEW          |
| null       | metadata     | COLUMNS         | SYSTEM_TABLE  |
| null       | metadata     | TABLES          | SYSTEM_TABLE  |
Each collection in MongoDB appears here as a table. There are also the COLUMNS and TABLES system tables provided by Optiq, and a view called ZIPS defined in mongo-zips-model.json.

Let's try a simple query. How many zip codes in America?
sqlline> SELECT count(*) FROM zips;
| EXPR$0  |
| 29467   |
1 row selected (0.746 seconds)
Now a more complex one. How many states have a city called Springfield?
sqlline> SELECT count(DISTINCT state) AS c FROM zips WHERE city = 'SPRINGFIELD';
| C   |
| 20  |
1 row selected (0.549 seconds)
Let's use the SQL EXPLAIN command to see how the query is implemented.
sqlline> !set outputformat csv
sqlline> EXPLAIN PLAN FOR SELECT count(DISTINCT state) AS c FROM zips WHERE city = 'SPRINGFIELD';
'PLAN'
'EnumerableAggregateRel(group=[{}], C=[COUNT($0)])
  EnumerableAggregateRel(group=[{0}])
    EnumerableCalcRel(expr#0..4=[{inputs}], expr#5=['SPRINGFIELD'], expr#6=[=($t0, $t5)], STATE=[$t3], $condition=[$t6])
      MongoToEnumerableConverter
        MongoTableScan(table=[[mongo_raw, zips]], ops=[[<{city: 1, state: 1, _id: 1}, {$project ...}>]])'
1 row selected (0.115 seconds)

The last line of the plan shows that Optiq calls MongoDB's find operator, asking for the "city", "state" and "_id" fields. The first three lines show that the filter and aggregation are implemented in Optiq's built-in operators; we're working on pushing them down to MongoDB as well.
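To make that division of labour concrete, here is a small Python sketch of the client-side work that remains once the projection has been pushed down. The rows are mocked stand-ins for the cursor that MongoDB's find would return; only the projected fields are present.

```python
# Sketch of the hybrid evaluation for the Springfield query. MongoDB's find
# returns only the projected fields {city: 1, state: 1, _id: 1}; the filter
# and the distinct-count aggregation then run on the client. The rows below
# are a mocked stand-in for the real cursor.
rows = [
    {"city": "SPRINGFIELD", "state": "IL", "_id": "62701"},
    {"city": "SPRINGFIELD", "state": "MA", "_id": "01101"},
    {"city": "ACMAR",       "state": "AL", "_id": "35004"},
]

# Client side: WHERE city = 'SPRINGFIELD', then COUNT(DISTINCT state).
distinct_states = {r["state"] for r in rows if r["city"] == "SPRINGFIELD"}
c = len(distinct_states)
print(c)  # 2 for this mocked data
```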

Finally, quit sqlline.
sqlline> !quit
Closing: net.hydromatic.optiq.jdbc.FactoryJdbc41$OptiqConnectionJdbc41

Optiq and its MongoDB adapter shown here are available on GitHub. If you are interested in writing your own adapter, check out optiq-csv, a sample adapter for Optiq that makes CSV files appear as tables. It has its own tutorial on writing adapters.

Check back at this blog over the next few months, and I'll show how to write views and advanced queries using Optiq, and how to use Optiq's other adapters.


crorella said...

Impressive work!

KeithS said...

We're on a similar wavelength. Check out

Enzo said...

Hi Julian

Great stuff, but I couldn't make it work.
See what I got - any help is welcome. Thanks! Enzo
sqlline> !connect jdbc:optiq:model=mongodb/target/test-classes/mongo-zips-model.json admin admin
java.lang.RuntimeException: Error instantiating JsonCustomSchema(name=mongo_raw)
at net.hydromatic.optiq.model.ModelHandler.visit(
at net.hydromatic.optiq.model.JsonCustomSchema.accept(
at net.hydromatic.optiq.model.ModelHandler.visit(
at net.hydromatic.optiq.model.ModelHandler.(
at net.hydromatic.optiq.jdbc.Driver$1.onConnectionInit(
at net.hydromatic.optiq.jdbc.UnregisteredDriver.connect(
at sqlline.SqlLine$DatabaseConnection.connect(
at sqlline.SqlLine$DatabaseConnection.getConnection(
at sqlline.SqlLine$Commands.connect(
at sqlline.SqlLine$Commands.connect(
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(
at sun.reflect.DelegatingMethodAccessorImpl.invoke(
at java.lang.reflect.Method.invoke(
at sqlline.SqlLine$ReflectiveCommandHandler.execute(
at sqlline.SqlLine.dispatch(
at sqlline.SqlLine.begin(
at sqlline.SqlLine.mainWithInputRedirection(
at sqlline.SqlLine.main(
Caused by: java.lang.ClassNotFoundException: net.hydromatic.optiq.impl.mongodb.MongoSchemaFactory
at Method)
at java.lang.ClassLoader.loadClass(
at sun.misc.Launcher$AppClassLoader.loadClass(
at java.lang.ClassLoader.loadClass(
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(
at net.hydromatic.optiq.model.ModelHandler.visit(
... 18 more
0: jdbc:optiq:model=mongodb/target/test-class> !quit
Connection is already closed.

Julian Hyde said...

Oops. The recent changes in directory structure broke things. I've fixed them now. Pull the latest, remove .classpath.txt, and run sqlline again.

Note that the model path has changed. The line is now

!connect jdbc:optiq:model=mongodb/target/test-classes/mongo-zips-model.json admin admin

Admin said...

Can this be used to connect to mongodb://localhost/databasename ??

Can it directly connect to mongoDB instead of a model file ?

What is the sqlline command ?

Julian Hyde said...

Can this be used to connect to mongodb://localhost/databasename ??

Yes. You can specify the host and database name in the model file.
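A fragment along these lines shows where the connection details go; the "host" and "database" operand properties are what the MongoDB schema factory reads, though the exact shape may differ between versions:

```json
{
  "version": "1.0",
  "defaultSchema": "mongo",
  "schemas": [
    {
      "type": "custom",
      "name": "mongo_raw",
      "factory": "net.hydromatic.optiq.impl.mongodb.MongoSchemaFactory",
      "operand": {
        "host": "localhost",
        "database": "test"
      }
    }
  ]
}
```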

The link to the model file was broken in the post. It is now, and I have corrected the post.

Can it directly connect to mongoDB instead of a model file ?

Not at present, but it makes sense. I have logged

What is the sqlline command ?

It is described in the post. "./sqlline" from the shell, followed by "!connect" from within sqlline.

Gon Per said...

I was trying this example on Windows with Cygwin, but sqlline doesn't seem to work: I enter the shell normally, but then none of the commands do anything at all. It just gets stuck.

Any idea why?

Julian Hyde said...

Sounds like an issue with sqlline on Windows/Cygwin. Can you log it, please?

Julian Hyde said...
This comment has been removed by the author.
Julian Hyde said...

Using the latest optiq (commit 5a4126 or later) you should be able to connect from both cygwin (using sqlline) and native windows (using sqlline.bat). If you still have errors, please log an issue.

vj said...

Hi Julian,

Can we perform OLAP analysis using Mondrian/olap4j after converting the collections in MongoDB to tables?

Julian Hyde said...

Yes. optiq-on-MongoDB is just a regular JDBC data source as far as Mondrian is concerned.

Nick Hamby said...

If I had a different JSON file I wanted to query, how would I go about generating a model file for it? I'm interested in using optiq to query mongo databases and need a way to adapt optiq to a variety of JSON files.

Julian Hyde said...

If the data is outside MongoDB, you could use optiq-csv, to which we recently added support for JSON files.

But within MongoDB, I suggest that you load each file into a different MongoDB collection in the same MongoDB database. From Optiq each MongoDB collection automatically appears as a SQL table with a column called _MAP. (In the example, that table is called "mongo_raw"."zips".) You can create views on those tables, similar to the "mongo"."zips" view in the example, or you can query the raw table.
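To make that concrete, a view over the raw table can pull typed columns out of the _MAP column with CAST; the ZIPS view in the sample model is defined along these lines (written here as standalone SQL for readability, though in the model file it appears as a SQL string in a view entry):

```sql
-- Sketch: expose typed columns from the raw collection's single _MAP column.
-- Field names come from the zips data set; the types are illustrative.
SELECT CAST(_MAP['city']   AS VARCHAR(20)) AS city,
       CAST(_MAP['loc'][0] AS FLOAT)       AS longitude,
       CAST(_MAP['loc'][1] AS FLOAT)       AS latitude,
       CAST(_MAP['pop']    AS INTEGER)     AS pop,
       CAST(_MAP['state']  AS VARCHAR(2))  AS state,
       CAST(_MAP['_id']    AS VARCHAR(5))  AS id
FROM "mongo_raw"."zips";
```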

Nick Hamby said...

So does that mean there is a command I can execute via optiq-csv to generate a model around a .json file? I checked through the tutorial and readme and was unable to find a way to access data without having a model file built by hand first.

Julian Hyde said...

If all those .json files are in the same directory, optiq-csv's schema adapter will make them automatically appear in a schema as tables.

Building views onto those tables with columns of particular names and types is a manual task. You will need to edit the model file by hand.

If you have in mind a tool that would help further, log a jira case describing it.

Nick Hamby said...

I think this example may have lost functionality: when trying to select loc[0] or loc[1], the two floats return as null in all rows.