Just a little memo about mongodump and mongorestore:
- mongodump: dumps all databases into the dump directory
- mongorestore -d db_name path_to_db_name_files: inserts all data into db_name. Use the --drop option to drop the existing database first.
Here we describe a way to work with embedded documents and ExtJS models, especially on the parsing (GET) side. Embedded documents are a powerful feature of MongoDB, my favorite one: a nice alternative to a one-to-many SQL JOIN.
For example, suppose a GET query returns the following JSON data:
{
    "success": true,
    "user": [{
        "id": 1,
        "name": "Philip J. Fry",
        "last_post": {
            "title": "Post 1",
            "posted": "2015-03-03T12:45:14Z"
        }
    }]
}
In ExtJS, the models are:
Ext.define('MyApp.model.User', {
    extend: 'Ext.data.Model',
    fields: [
        { name: 'id', type: 'int' }
        ,{ name: 'name', type: 'string' }
        ,{ name: 'last_post', reference: 'Post' }
    ]
}); // eo MyApp.model.User
and
Ext.define('MyApp.model.Post', {
    extend: 'Ext.data.Model',
    fields: [
        { name: 'title', type: 'string' }
        ,{ name: 'posted', type: 'date' }
    ]
}); // eo MyApp.model.Post
At this point, even though we can access last_post fields (last_post.title), the "posted" field is not parsed, and so is not transformed into a Date object. There are a couple of solutions, but my favorite and the most efficient one is the following: use a "convert" function for the "last_post" field.
So, User model becomes:
Ext.define('MyApp.model.User', {
    extend: 'Ext.data.Model',
    fields: [
        { name: 'id', type: 'int' }
        ,{ name: 'name', type: 'string' }
        ,{ name: 'last_post',
            // Here is the trick... =>
            convert: function(value) {
                return Ext.create('MyApp.model.Post', value).data;
            }
        }
    ]
}); // eo MyApp.model.User
This way, the transformation is handled by the "convert" function and works efficiently.
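The same parsing concern exists on the server side. As a side note, here is a minimal Python sketch (stdlib only, hypothetical payload handling) that turns the embedded post's ISO-8601 "posted" string into a real datetime, just like the convert function does client-side:

```python
import json
from datetime import datetime, timezone

# Same JSON payload as above (hypothetical GET response)
payload = '''{
  "success": true,
  "user": [{
    "id": 1,
    "name": "Philip J. Fry",
    "last_post": {"title": "Post 1", "posted": "2015-03-03T12:45:14Z"}
  }]
}'''

data = json.loads(payload)
post = data["user"][0]["last_post"]

# The raw value is just a string; parse it into a timezone-aware datetime
posted = datetime.strptime(post["posted"], "%Y-%m-%dT%H:%M:%SZ")
posted = posted.replace(tzinfo=timezone.utc)

print(posted.isoformat())  # 2015-03-03T12:45:14+00:00
```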
Quick note: when defining an __init__ method for a MongoEngine Document, do not forget to call the parent constructor. Example:

from mongoengine import *

class MyDocument(Document):
    def __init__(self, *args, **kwargs):
        Document.__init__(self, *args, **kwargs)
        # [...]
I have recently tested Django with MongoDB, using MongoEngine to connect them.
I often use attributes common to multiple models and I do not like copy-pasting, so I always use class inheritance. This is particularly true with ORMs (such as Doctrine within the Symfony 2 PHP framework).
For example, my "base document" looks like this (pseudocode, no real syntax):

class BaseDocument
    - attribute 1
    - attribute 2
    - attribute 3
and derived documents look like:

class Derived1 derived from BaseDocument
    [...]
With MongoEngine, your models derive from Document. For example:

from mongoengine import *

class BaseDocument(Document):
    title = StringField()
    content = StringField()
If you want to derive from BaseDocument, there are two approaches. The first one is direct inheritance:

class Derived1(BaseDocument):
    author = StringField()

but you'll get an error: the MongoEngine Document is not extended properly (by default, document inheritance requires meta = {'allow_inheritance': True} on the base document).
So far, the best way I have found is:

# Here, BaseDocument is a 'raw' class
class BaseDocument():
    title = StringField()
    content = StringField()

# the targeted class inherits from MongoEngine's Document and our BaseDocument
class Derived1(Document, BaseDocument):
    author = StringField()

    def __init__(self, *args, **kwargs):
        Document.__init__(self, *args, **kwargs)
        BaseDocument.__init__(self)
No more errors, and everything works as expected.
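The mixin trick itself is plain Python; here is a stdlib-only sketch (with a hypothetical stand-in for MongoEngine's Document) showing why both parent constructors must run:

```python
class BaseDocument:
    """Raw base class holding the shared attributes."""
    def __init__(self):
        self.title = None
        self.content = None

class Document:
    """Hypothetical stand-in for mongoengine.Document."""
    def __init__(self):
        self._initialised = True

class Derived1(Document, BaseDocument):
    def __init__(self):
        # Both parents must be initialised explicitly,
        # otherwise one of them is silently skipped.
        Document.__init__(self)
        BaseDocument.__init__(self)
        self.author = None

d = Derived1()
print(d._initialised, hasattr(d, "title"), hasattr(d, "author"))  # True True True
```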
Symfony2 is a powerful PHP framework, backed by the Doctrine ORM. MongoDB uses JSON/BSON storage for its documents.
When using SF2 for REST/JSON requests, JSON export of DB objects is often required, and there are many ways to achieve it from an ORM point of view.
In SF2, I do recommend JMSSerializerBundle: it is simple to install, thanks to Composer, and very easy to use.
For example, to encode data within the controller:
$serializer = \JMS\Serializer\SerializerBuilder::create()->build();
$jsonContent = $serializer->serialize($data, 'json');
At this point, $jsonContent is a string. To send it as the answer:

$response = new Response($jsonContent);
$response->headers->set('Content-Type', 'application/json');
return $response;
And that’s all.
To be a little more efficient, we can create a new "JsonResponse" class:

namespace AB3\ExampleBundle\Internal;

use Symfony\Component\HttpFoundation\Response;

class JsonResponse extends Response
{
    public static function create($jsonContent)
    {
        $response = new Response($jsonContent);
        $response->headers->set('Content-Type', 'application/json');
        return $response;
    }
}
From my daily usage, I'd rather put this method directly in a root controller that all my controllers extend, in order to simplify common tasks:

namespace AB3\ExampleBundle\Controller;

use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Response;

class ARootController extends Controller
{
    public function sendJsonOk($data)
    {
        $serializer = \JMS\Serializer\SerializerBuilder::create()->build();
        $jsonContent = $serializer->serialize($data, 'json');

        $response = new Response($jsonContent);
        $response->headers->set('Content-Type', 'application/json');
        return $response;
    }
}
Then the call is quite simple from any action in a controller:

class MyController extends ARootController
{
    public function exampleAction()
    {
        $data = ....; // get some data to send back to the user
        return $this->sendJsonOk($data);
    }
}
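The pattern in ARootController is framework-agnostic: serialize the data, then attach the right Content-Type header. A minimal Python sketch of the same idea (hypothetical helper name, stdlib json standing in for JMSSerializer):

```python
import json

def send_json_ok(data):
    """Build a (headers, body) pair, mirroring sendJsonOk above."""
    body = json.dumps(data)
    headers = {"Content-Type": "application/json"}
    return headers, body

headers, body = send_json_ok({"success": True, "id": 1})
print(headers["Content-Type"])  # application/json
```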
I switched an app from MySQL to MongoDB 2 years ago.
First, why? The app's purpose is to dispatch work orders to people in the field, but also to get back data such as reports, pictures, GPS coordinates, action times, and so on.
At the beginning, the app was developed for one customer, based on its specifications. As the business grew, every customer had its own needs and, most of the time, its own customizations. Adding more and more fields, whatever the approach, made the SQL DB messy.
At this point, we had two options: first, put every customer-specific value into a big JSON-encoded text field (incidentally, the ability to search within such a field is a key feature of PostgreSQL 9.4); second, find another way.
MongoDB appears to be a nice choice to get more flexibility.
The Pros:
The cons:
Two years later, here is some feedback. For context, we collect a lot of GPS points from smartphones in the field, even if some are dropped.
The Pros:
The cons:
The question: was it a good or a bad choice? From a business and cost point of view, it was, and still is, a really good choice, no doubt. But with graph-oriented DBs and other options (Cassandra, PostgreSQL 9.4, HBase, …), there are many openings to test in the future.