343 votes

I'd like to get the names of all the keys in a MongoDB collection.

For example, from this:

db.things.insert( { type : ['dog', 'cat'] } );
db.things.insert( { egg : ['cat'] } );
db.things.insert( { type : [] } );
db.things.insert( { hello : []  } );

I'd like to get the unique keys:

type, egg, hello

22 Answers

369 votes

You could do this with MapReduce:

mr = db.runCommand({
  "mapreduce" : "my_collection",
  "map" : function() {
    for (var key in this) { emit(key, null); }
  },
  "reduce" : function(key, stuff) { return null; }, 
  "out": "my_collection" + "_keys"
})

Then run distinct on the resulting collection to find all the keys:

db[mr.result].distinct("_id")
["foo", "bar", "baz", "_id", ...]

219 votes

With Kristina's answer as inspiration, I created an open source tool called Variety which does exactly this: https://github.com/variety/variety

104 votes

You can use aggregation with the $objectToArray operator (new in MongoDB 3.4.4) to convert all top-level key-value pairs into document arrays, followed by $unwind and $group with $addToSet to get distinct keys across the entire collection. (Use $$ROOT to reference the top-level document.)

db.things.aggregate([
  {"$project":{"arrayofkeyvalue":{"$objectToArray":"$$ROOT"}}},
  {"$unwind":"$arrayofkeyvalue"},
  {"$group":{"_id":null,"allkeys":{"$addToSet":"$arrayofkeyvalue.k"}}}
])

You can use the following query to get the keys of a single document:

db.things.aggregate([
  {"$match":{_id: "<<ID>>"}}, /* Replace with the document's ID */
  {"$project":{"arrayofkeyvalue":{"$objectToArray":"$$ROOT"}}},
  {"$project":{"keys":"$arrayofkeyvalue.k"}}
])

19 votes

A cleaned up and reusable solution using pymongo:

from pymongo import MongoClient
from bson import Code

def get_keys(db, collection):
    client = MongoClient()
    db = client[db]
    map = Code("function() { for (var key in this) { emit(key, null); } }")
    reduce = Code("function(key, stuff) { return null; }")
    result = db[collection].map_reduce(map, reduce, "myresults")
    return result.distinct('_id')

Usage:

get_keys('dbname', 'collection')
>> ['key1', 'key2', ... ]

18 votes

If your target collection is not too large, you can try this in the mongo shell client:

var allKeys = {};

db.YOURCOLLECTION.find().forEach(function (doc) {
    Object.keys(doc).forEach(function (key) { allKeys[key] = 1; });
});

allKeys;
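
allKeys ends up as an object whose property names are the collected keys; to get them as a plain array, standard JavaScript is enough (a small follow-up sketch):

Object.keys(allKeys)   // e.g. [ "_id", "type", "egg", "hello" ] for the question's collection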

16 votes

Try this:

doc=db.things.findOne();
for (key in doc) print(key);

11 votes

Using Python, this returns the set of all top-level keys in the collection:

# Using pymongo and a connection named 'db'
from functools import reduce  # needed on Python 3, where reduce is no longer a builtin

reduce(
    lambda all_keys, rec_keys: all_keys | set(rec_keys), 
    map(lambda d: d.keys(), db.things.find()), 
    set()
)

11 votes

If you are using MongoDB 3.4.4 or above, you can use the aggregation below, which relies on the $objectToArray and $group stages:

db.collection.aggregate([
  { "$project": {
    "data": { "$objectToArray": "$$ROOT" }
  }},
  { "$project": { "data": "$data.k" }},
  { "$unwind": "$data" },
  { "$group": {
    "_id": null,
    "keys": { "$addToSet": "$data" }
  }}
])
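
Against the question's things collection, this yields a single document collecting every distinct top-level key (the order inside the set may vary):

{ "_id" : null, "keys" : [ "_id", "type", "egg", "hello" ] }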


9 votes

Here is a sample in Python; it returns the results inline.

from pymongo import MongoClient
from bson.code import Code

client = MongoClient()   # adjust the connection string for your deployment
db = client["test"]      # assumed database name; replace with your own

mapper = Code("""
    function() {
                  for (var key in this) { emit(key, null); }
               }
""")
reducer = Code("""
    function(key, stuff) { return null; }
""")

distinctThingFields = db.things.map_reduce(mapper, reducer
    , out = {'inline' : 1}
    , full_response = True)
## do something with distinctThingFields['results']

6 votes

I think the best way to do this, as mentioned here, is without the $unwind operator and with only two stages in the pipeline. Instead we can use the $mergeObjects and $objectToArray operators (note that $mergeObjects requires MongoDB 3.6+).

In the $group stage, we use the $mergeObjects operator to return a single document whose key/value pairs come from all documents in the collection.

Then comes the $project stage, where we use $map and $objectToArray to return the keys.

let allTopLevelKeys =  [
    {
        "$group": {
            "_id": null,
            "array": {
                "$mergeObjects": "$$ROOT"
            }
        }
    },
    {
        "$project": {
            "keys": {
                "$map": {
                    "input": { "$objectToArray": "$array" },
                    "in": "$$this.k"
                }
            }
        }
    }
];
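
For reference, this pipeline can then be run against the question's collection like so (a minimal usage sketch, assuming the things collection from the question):

db.things.aggregate(allTopLevelKeys)
// e.g. [ { "_id" : null, "keys" : [ "_id", "type", "egg", "hello" ] } ]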

Now if we have nested documents and want to get those keys as well, this is doable. For simplicity, let's consider documents with a simple embedded document that look like this:

{field1: {field2: "abc"}, field3: "def"}
{field1: {field3: "abc"}, field4: "def"}

The following pipeline yields all the keys (field1, field2, field3, field4):

let allFirstSecondLevelKeys = [
    {
        "$group": {
            "_id": null,
            "array": {
                "$mergeObjects": "$$ROOT"
            }
        }
    },
    {
        "$project": {
            "keys": {
                "$setUnion": [
                    {
                        "$map": {
                            "input": {
                                "$reduce": {
                                    "input": {
                                        "$map": {
                                            "input": {
                                                "$objectToArray": "$array"
                                            },
                                            "in": {
                                                "$cond": [
                                                    {
                                                        "$eq": [
                                                            {
                                                                "$type": "$$this.v"
                                                            },
                                                            "object"
                                                        ]
                                                    },
                                                    {
                                                        "$objectToArray": "$$this.v"
                                                    },
                                                    [
                                                        "$$this"
                                                    ]
                                                ]
                                            }
                                        }
                                    },
                                    "initialValue": [

                                    ],
                                    "in": {
                                        "$concatArrays": [
                                            "$$this",
                                            "$$value"
                                        ]
                                    }
                                }
                            },
                            "in": "$$this.k"
                        }
                    }
                ]
            }
        }
    }
]

With a little effort, we can also get the keys of all subdocuments in an array field where the elements are objects.

6 votes

I am surprised that no one here has answered using simple JavaScript and Set logic to automatically filter duplicate values. A simple example in the mongo shell is below:

var allKeys = new Set()
db.collectionName.find().forEach( function (o) {for (key in o ) allKeys.add(key)})
for(let key of allKeys) print(key)

This will print all possible unique keys in the collection collectionName.

3 votes

This works fine for me:

var arrayOfFieldNames = [];

var items = db.NAMECOLLECTION.find();

while(items.hasNext()) {
  var item = items.next();
  for(var index in item) {
    arrayOfFieldNames[index] = index;
   }
}

for (var index in arrayOfFieldNames) {
  print(index);
}

3 votes

Maybe slightly off-topic, but you can recursively pretty-print all keys/fields of an object:

function _printFields(item, level) {
    if ((typeof item) != "object") {
        return
    }
    for (var index in item) {
        print(" ".repeat(level * 4) + index)
        if ((typeof item[index]) == "object") {
            _printFields(item[index], level + 1)
        }
    }
}

function printFields(item) {
    _printFields(item, 0)
}
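
For example, to print the field tree of a single document from the question's collection (assuming the things collection):

printFields(db.things.findOne())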

Useful when all objects in a collection have the same structure.

1 vote

To get a list of all the keys minus _id, consider running the following aggregate pipeline:

var keys = db.collection.aggregate([
    { "$project": {
       "hashmaps": { "$objectToArray": "$$ROOT" } 
    } }, 
    { "$project": {
       "fields": "$hashmaps.k"
    } },
    { "$group": {
        "_id": null,
        "fields": { "$addToSet": "$fields" }
    } },
    { "$project": {
            "keys": {
                "$setDifference": [
                    {
                        "$reduce": {
                            "input": "$fields",
                            "initialValue": [],
                            "in": { "$setUnion" : ["$$value", "$$this"] }
                        }
                    },
                    ["_id"]
                ]
            }
        }
    }
]).toArray()[0]["keys"];

0 votes

As per the MongoDB documentation, a combination of distinct

Finds the distinct values for a specified field across a single collection or view and returns the results in an array.

and the indexes collection operation is what would return all possible values for a given key, or index:

Returns an array that holds a list of documents that identify and describe the existing indexes on the collection

So in a given method one could use something like the following, in order to query a collection for all its registered indexes and return, say, an object with the indexes as keys (this example uses async/await for NodeJS, but obviously you could use any other asynchronous approach):

async function GetFor(collection, index) {

    let currentIndexes;
    let indexNames = [];
    let final = {};
    let vals = [];

    try {
        currentIndexes = await collection.indexes();
        await ParseIndexes();
        //Check if a specific index was queried, otherwise, iterate for all existing indexes
        if (index && typeof index === "string") return await ParseFor(index, indexNames);
        await ParseDoc(indexNames);
        await Promise.all(vals);
        return final;
    } catch (e) {
        throw e;
    }

    function ParseIndexes() {
        return new Promise(function (result) {
            let err;
            for (let ind in currentIndexes) {
                let index = currentIndexes[ind];
                if (!index) {
                    err = "No Key For Index "+index; break;
                }
                let Name = Object.keys(index.key);
                if (Name.length === 0) {
                    err = "No Name For Index"; break;
                }
                indexNames.push(Name[0]);
            }
            return result(err ? Promise.reject(err) : Promise.resolve());
        })
    }

    async function ParseFor(index, inDoc) {
        if (inDoc.indexOf(index) === -1) throw "No Such Index In Collection";
        try {
            await DistinctFor(index);
            return final;
        } catch (e) {
            throw e
        }
    }
    function ParseDoc(doc) {
        return new Promise(function (result) {
            let err;
            for (let index in doc) {
                let key = doc[index];
                if (!key) {
                    err = "No Key For Index "+index; break;
                }
                vals.push(new Promise(function (pushed) {
                    DistinctFor(key)
                        .then(pushed)
                        .catch(function (err) {
                            return pushed(Promise.resolve());
                        })
                }))
            }
            return result(err ? Promise.reject(err) : Promise.resolve());
        })
    }

    async function DistinctFor(key) {
        if (!key) throw "Key Is Undefined";
        try {
            final[key] = await collection.distinct(key);
        } catch (e) {
            final[key] = 'failed';
            throw e;
        }
    }
}

So querying a collection with the basic _id index would return the following (the test collection only had one document at the time of the test):

Mongo.MongoClient.connect(url, function (err, client) {
    assert.equal(null, err);

    let collection = client.db('my db').collection('the targeted collection');

    GetFor(collection, '_id')
        .then(function () {
            //returns
            // { _id: [ 5ae901e77e322342de1fb701 ] }
        })
        .catch(function (err) {
            //manage your error..
        })
});

Mind you, this uses methods native to the NodeJS driver. As some other answers have suggested, there are other approaches, such as the aggregation framework. I personally find this approach more flexible, as you can easily create and fine-tune how the results are returned. Obviously, this only addresses top-level attributes, not nested ones. Also, to guarantee that all documents are represented when there are secondary indexes (other than the main _id one), those indexes should be set as required.

0 votes

We can achieve this by using a mongo JS file. Add the code below to your getCollectionName.js file and run the JS file from a Linux console as shown below:

mongo --host 192.168.1.135 getCollectionName.js

db_set = connect("192.168.1.135:27017/database_set_name"); // for Local testing
// db_set.auth("username_of_db", "password_of_db"); // if required

db_set.getMongo().setSlaveOk();

var collectionArray = db_set.getCollectionNames();

collectionArray.forEach(function(collectionName){

    if ( collectionName == 'system.indexes' || collectionName == 'system.profile' || collectionName == 'system.users' ) {
        return;
    }

    print("\nCollection Name = "+collectionName);
    print("All Fields :\n");

    var arrayOfFieldNames = []; 
    var items = db_set[collectionName].find();
    // var items = db_set[collectionName].find().sort({'_id':-1}).limit(100); // if you want fast & scan only last 100 records of each collection
    while(items.hasNext()) {
        var item = items.next(); 
        for(var index in item) {
            arrayOfFieldNames[index] = index;
        }
    }
    for (var index in arrayOfFieldNames) {
        print(index);
    }

});

quit();

Thanks @ackuser

0 votes

Following the thread from @James Cropcho's answer, I landed on the following, which I found to be super easy to use. It is a binary tool, which is exactly what I was looking for: mongoeye.

Using this tool, it took about 2 minutes to get my schema exported from the command line.

0 votes

I know this question is 10 years old, but there is no C# solution and this took me hours to figure out. I'm using the .NET driver and System.Linq to return a list of the keys.

// Assumes "collection" is an IMongoCollection<BsonDocument>,
// e.g. client.GetDatabase("mydb").GetCollection<BsonDocument>("things")
var map = new BsonJavaScript("function() { for (var key in this) { emit(key, null); } }");
var reduce = new BsonJavaScript("function(key, stuff) { return null; }");
var options = new MapReduceOptions<BsonDocument, BsonDocument>();
var result = await collection.MapReduceAsync(map, reduce, options);
var list = result.ToEnumerable().Select(item => item["_id"].ToString());

0 votes

Based on @Wolkenarchitekt's answer (https://stackoverflow.com/a/48117846/8808983), I wrote a script that can find patterns in all keys in the db, and I think it can help others reading this thread:

"""
Python 3
This script get list of patterns and print the collections that contains fields with this patterns.
"""

import argparse

import pymongo
from bson import Code


# initialize mongo connection:
def get_db():
    client = pymongo.MongoClient("172.17.0.2")
    db = client["Data"]
    return db


def get_commandline_options():
    description = "To run use: python db_fields_pattern_finder.py -p <list_of_patterns>"
    parser = argparse.ArgumentParser(description=description)
    parser.add_argument('-p', '--patterns', nargs="+", help='List of patterns to look for in the db.', required=True)
    return parser.parse_args()


def report_matching_fields(relevant_fields_by_collection):
    print("Matches:")

    for collection_name in relevant_fields_by_collection:
        if relevant_fields_by_collection[collection_name]:
            print(f"{collection_name}: {relevant_fields_by_collection[collection_name]}")

    # pprint(relevant_fields_by_collection)


def get_collections_names(db):
    """
    :param pymongo.database.Database db:
    :return list: collections names
    """
    return db.list_collection_names()


def get_keys(db, collection):
    """
    See: https://stackoverflow.com/a/48117846/8808983
    :param db:
    :param collection:
    :return:
    """
    map = Code("function() { for (var key in this) { emit(key, null); } }")
    reduce = Code("function(key, stuff) { return null; }")
    result = db[collection].map_reduce(map, reduce, "myresults")
    return result.distinct('_id')


def get_fields(db, collection_names):
    fields_by_collections = {}
    for collection_name in collection_names:
        fields_by_collections[collection_name] = get_keys(db, collection_name)
    return fields_by_collections


def get_matches_fields(fields_by_collections, patterns):
    relevant_fields_by_collection = {}
    for collection_name in fields_by_collections:
        relevant_fields = [field for field in fields_by_collections[collection_name] if
                           [pattern for pattern in patterns if
                            pattern in field]]
        relevant_fields_by_collection[collection_name] = relevant_fields

    return relevant_fields_by_collection


def main(patterns):
    """
    :param list patterns: List of strings to look for in the db.
    """
    db = get_db()

    collection_names = get_collections_names(db)
    fields_by_collections = get_fields(db, collection_names)
    relevant_fields_by_collection = get_matches_fields(fields_by_collections, patterns)

    report_matching_fields(relevant_fields_by_collection)


if __name__ == '__main__':
    args = get_commandline_options()
    main(args.patterns)

-1 votes

I extended Carlos LM's solution a bit so it's more detailed.

Example of a schema:

var schema = {
    _id: 123,
    id: 12,
    t: 'title',
    p: 4.5,
    ls: [{
            l: 'lemma',
            p: {
                pp: 8.9
            }
        },
         {
            l: 'lemma2',
            p: {
               pp: 8.3
           }
        }
    ]
};

Type into the console:

var schemafy = function(schema, i, limit) {
    var i = (typeof i !== 'undefined') ? i : 1;
    var limit = (typeof limit !== 'undefined') ? limit : false;
    var type = '';
    var array = false;

    for (key in schema) {
        type = typeof schema[key];
        array = (schema[key] instanceof Array) ? true : false;

        if (type === 'object') {
            print(Array(i).join('    ') + key+' <'+((array) ? 'array' : type)+'>:');
            schemafy(schema[key], i+1, array);
        } else {
            print(Array(i).join('    ') + key+' <'+type+'>');
        }

        if (limit) {
            break;
        }
    }
}

Run:

schemafy(db.collection.findOne());

Output:

_id <number>
id <number>
t <string>
p <number>
ls <object>:
    0 <object>:
    l <string>
    p <object>:
        pp <number> 

-1 votes

I was trying to write this in Node.js and finally came up with the following:

db.collection('collectionName').mapReduce(
function() {
    for (var key in this) {
        emit(key, null);
    }
},
function(key, stuff) {
    return null;
}, {
    "out": "allFieldNames"
},
function(err, results) {
    var fields = db.collection('allFieldNames').distinct('_id');
    fields
        .then(function(data) {
            var finalData = {
                "status": "success",
                "fields": data
            };
            res.send(finalData);
            delteCollection(db, 'allFieldNames');
        })
        .catch(function(err) {
            res.send(err);
            delteCollection(db, 'allFieldNames');
        });
 });

After reading the newly created collection "allFieldNames", delete it.

db.collection("allFieldNames").remove({}, function (err,result) {
     db.close();
     return; 
});

-3 votes

I have a simpler workaround...

What you can do is, while inserting data/documents into your main collection "things", also insert the attributes into a separate collection, let's say "things_attributes".

So every time you insert into "things", you fetch from "things_attributes", compare that document's stored keys with your new document's keys, and if any new key is present, append it to that document and re-insert it.

So "things_attributes" will have only one document of unique keys, which you can easily get whenever you need it by using findOne().
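
A minimal mongo shell sketch of this idea (the insertThing helper, the things_attributes collection name, and the use of $addToSet to do the compare-and-append step in a single update are illustrative assumptions, not part of the original answer):

// Wrap inserts so every top-level key name is accumulated in a single
// document inside the hypothetical things_attributes collection.
function insertThing(doc) {
    var keys = Object.keys(doc);        // top-level keys supplied by the caller
    db.things.insert(doc);
    db.things_attributes.update(
        { _id: "keys" },                                      // single accumulator document
        { $addToSet: { keys: { $each: keys } } },             // record any keys not seen before
        { upsert: true }
    );
}

insertThing({ type: ["dog", "cat"] });
insertThing({ egg: ["cat"] });

db.things_attributes.findOne({ _id: "keys" }).keys;   // e.g. [ "type", "egg" ]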