Wednesday, June 13, 2018

More Porting Data from MongoDB to the MySQL Document Store

Last time we looked at moving a JSON data set from MongoDB to the MySQL Document Store.  Let's move another and then see how to investigate this data.  We will use the primer-dataset.json that contains data on restaurants around New York City.

Loading Data


The loading of the JSON data set was covered last time, but here is the gist. The first step is to fire up the MySQL Shell and log in to the server. We need a new schema for this data, and the example creates one named nyeats. Then, within that new schema, a collection is created with the name restaurants.
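In the shell, that sequence looks roughly like the following. This is a sketch in the shell's JavaScript mode; the account and port are assumptions, not taken from the original screenshots.

```
mysql-js> \connect root@localhost:33060
mysql-js> session.createSchema('nyeats')
mysql-js> \use nyeats
mysql-js> db.createCollection('restaurants')
```

After the \use, the db object points at the nyeats schema, so the new collection is reachable as db.restaurants.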


Then we switch to Python mode, where a simple program reads the data from the file and loads the collection.
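The loader can be sketched as below. The parsing half is plain Python; the collection half only runs inside the MySQL Shell's Python mode, so it is shown as comments. The file name and the one-document-per-line layout match the MongoDB export of primer-dataset.json; everything else is an assumption.

```python
import json

def read_documents(path):
    """Parse one JSON document per line, skipping blank lines."""
    docs = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                docs.append(json.loads(line))
    return docs

# Inside the MySQL Shell's Python mode you would then add each document
# to the collection (session is provided by the shell; names assumed):
#
#   collection = session.get_schema('nyeats').get_collection('restaurants')
#   for doc in read_documents('primer-dataset.json'):
#       collection.add(doc).execute()
```

Adding documents one at a time is the simplest approach; batching several documents into a single add() call would cut down on round trips for a file this size.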

What types of Restaurants

We could page through all the restaurants, but it is easier to start with the types of cuisine. And it would be nice to see how many restaurants serve each type.
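A grouped find over the collection produces that breakdown. The call below is reconstructed from the shape of the result set, so treat the exact field list and the ascending sort on the count as assumptions:

```
mysql-js> db.restaurants.find().fields(["$.cuisine", "count('*')"]).groupBy(["$.cuisine"]).sort(["count('*')"])
```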

The result set:

[
    {
        "$.cuisine": "Polynesian",
        "count('*')": 1
    },
    {
        "$.cuisine": "Café/Coffee/Tea",
        "count('*')": 2
    },
    {
        "$.cuisine": "Cajun",
        "count('*')": 7
    },
...
    {
        "$.cuisine": "Latin (Cuban, Dominican, Puerto Rican, South & Central American)",
        "count('*')": 850
    },
    {
        "$.cuisine": "Other",
        "count('*')": 1011
    }
]
85 documents in set (0.7823 sec)

The big surprise for me was the 1,011 restaurants classified as Other, given the otherwise rather inclusive list of cuisine styles.


Feel like Red Beans and Rice

So let's narrow our search and look for some Cajun food.  But since we are health conscious, we will want to check the health department ratings of the restaurants.
And we can see the names of the restaurants along with their latest health department grades.
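A sketch of that query, with the field paths assumed from the data set's layout (a name field, and a grades array whose first entry is the most recent inspection):

```
mysql-js> db.restaurants.find('cuisine = "Cajun"').fields(["$.name", "$.grades[0].grade"])
```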


Next time we will dig deeper into our NYC restaurants.