I’m trying to get the number of objects saved in MongoDB, but I’m getting an error. I’m using PyMongo 2.0. Answer The find() function in PyMongo returns a cursor object (not an array). PyMongo does include a count_documents() function, meaning the code should look like this: Edit: Updated to the correct solution.
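A minimal sketch of what the answer describes, assuming a reasonably recent PyMongo (count_documents() was added in 3.7, so it is not available on 2.0) and hypothetical database/collection names:

```python
from pymongo import MongoClient

# Hypothetical connection and collection names, for illustration only.
client = MongoClient("mongodb://localhost:27017/")
collection = client["mydb"]["mycollection"]

# find() returns a cursor, not a list, so len() cannot be applied to it directly.
# count_documents() asks the server for the number of matching documents instead.
total = collection.count_documents({})                      # every document
active = collection.count_documents({"status": "active"})   # filtered count
print(total, active)
```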
Tag: mongodb
Right way to do parallel work in Python with files and a db?
I have a very large number of file names (from my PC) inserted in a db with status New by default. For every file name I want to do some operations (change the file). While a file is being changed, its status should change to Processing, and after the operations it should change to Processed. I decided to do it with Python’s multiprocessing module. Right now I have this solution, but …
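One way to structure this, sketched here under the assumption of a files collection with name and status fields (names are hypothetical): each worker atomically claims a New document, flips it to Processing, does the work, then marks it Processed.

```python
import multiprocessing
from pymongo import MongoClient, ReturnDocument

def process_file(name):
    pass  # placeholder for the actual per-file change

def worker():
    # Each process opens its own client; PyMongo clients should not be shared across forks.
    client = MongoClient("mongodb://localhost:27017/")
    files = client["mydb"]["files"]
    while True:
        # Atomically claim one "New" file so two workers never pick the same document.
        doc = files.find_one_and_update(
            {"status": "New"},
            {"$set": {"status": "Processing"}},
            return_document=ReturnDocument.AFTER,
        )
        if doc is None:
            break  # nothing left to process
        process_file(doc["name"])
        files.update_one({"_id": doc["_id"]}, {"$set": {"status": "Processed"}})

if __name__ == "__main__":
    procs = [multiprocessing.Process(target=worker) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

The find_one_and_update call is what makes the parallelism safe: the claim and the status change happen in a single server-side operation.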
MongoDB array of images: change schema
I have an array of images in MongoDB and I am trying to change the schema of that array. Right now the images are stored like below, and the final output I want is like below. How can I do this in mongosh? Or is it easier to do this as a Python array and then import it back into MongoDB? Thanks
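Since the question’s before/after schemas aren’t shown here, the following is only a sketch of the general technique: an update with an aggregation pipeline that rewrites the array in place via $map (MongoDB 4.2+). The field names and target shape are assumptions.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
coll = client["mydb"]["gallery"]   # hypothetical collection

# Assumed current shape:  images: ["a.jpg", "b.jpg"]
# Assumed target shape:   images: [{"url": "a.jpg"}, {"url": "b.jpg"}]
# Adjust the $map expression to match the real before/after schemas.
coll.update_many(
    {},
    [{"$set": {
        "images": {
            "$map": {
                "input": "$images",
                "as": "img",
                "in": {"url": "$$img"},
            }
        }
    }}],
)
```

The same pipeline works in mongosh with db.gallery.updateMany(...), so round-tripping through Python is not required.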
“Update” _id in a MongoDB document
PS: I only have to do this due to business requirements. I’m able to achieve it using mongosh, but since there are multiple records to be updated, I’m trying to implement a simple Python script to automate the task. Is it possible to do this with PyMongo? I’m not able to set the new _id in the doc variable in …
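Because _id is immutable, the usual workaround is to re-insert the document under the new _id and then delete the original. A sketch with PyMongo, where the selection filter and the way the new _id is produced are both assumptions:

```python
from bson import ObjectId
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
coll = client["mydb"]["records"]   # hypothetical collection

# Hypothetical filter for the documents whose _id must change.
for doc in coll.find({"needs_new_id": True}):
    old_id = doc["_id"]
    doc["_id"] = ObjectId()            # or whatever the new _id should be
    coll.insert_one(doc)               # raises DuplicateKeyError if the new _id already exists
    coll.delete_one({"_id": old_id})   # remove the original only after the copy succeeds
```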
Python Mongo docker-compose Topology Error
I’m trying a really simple example of Mongo with Python and got an error. Dockerfile: run_test.sh: docker-compose.yaml: db_test.py: db.py: I’m running docker-compose up and got this output: So it looks like it can’t connect to the database. I didn’t change the ports or anything, and another example works like a charm with these settings, so I don’t really know what I’m missing.
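A topology/server-selection error in this kind of setup is usually a hostname or startup-timing issue: inside the compose network the database must be reached by its service name, not localhost, and Mongo may not be ready the instant the app container starts. A sketch, assuming the service is called db in docker-compose.yaml:

```python
from pymongo import MongoClient
from pymongo.errors import ServerSelectionTimeoutError

# "localhost" inside the app container points at the app container itself;
# the compose service name ("db" here, an assumption) is the reachable hostname.
client = MongoClient("mongodb://db:27017/", serverSelectionTimeoutMS=5000)

try:
    client.admin.command("ping")   # forces a connection attempt and server selection
    print("connected")
except ServerSelectionTimeoutError as exc:
    print("mongo not reachable (yet):", exc)
```

Retrying the ping in a small loop, or adding a healthcheck/depends_on condition in the compose file, covers the case where the app starts before Mongo does.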
$elemMatch to fetch specific values inside an array
I have a collection named ‘attendance’ that has an array: I have been trying to query the values of a specific element in my array using $and and $elemMatch in: But it still prints the other section rather than just the one. I want the output to be: And I tried using dot notation like: Still no luck. I’m not sure …
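This is usually a filter-versus-projection mix-up: $elemMatch in the query filter only decides which documents match, while returning just the matching array element requires $elemMatch (or the positional $) in the projection as well. A sketch with hypothetical field names, since the real array isn’t shown here:

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
attendance = client["mydb"]["attendance"]

doc = attendance.find_one(
    {"sections": {"$elemMatch": {"section": "A"}}},            # which documents match
    {"sections": {"$elemMatch": {"section": "A"}}, "_id": 0},  # which array element is returned
)
print(doc)   # only the first matching element of "sections" comes back
```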
Perform $gte and $lt on the same field _id in MongoDB
db.comments.find({"_id": {"$gte": ObjectId("6225f932a7bce76715a9f3bd"), "$lt": ObjectId("6225f932a7bce76715a9f3bd")}}).sort({"created_datetime": 1}).limit(10).pretty() I am using this query, which should give me the current "6225f932a7bce76715a9f3bd" doc, the 4 docs inserted before it, and the 5 docs inserted after it. But currently, when I run this query, I get a null result. Where am I going wrong? Answer I had no other option but to separate my queries in order to …
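The empty result follows from the range itself: no _id can be both ≥ X and < X at the same time, which is why the answer splits the work into two queries. A sketch in PyMongo (the answer’s exact code isn’t shown here, so this is one way to do the split):

```python
from bson import ObjectId
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
comments = client["mydb"]["comments"]
anchor = ObjectId("6225f932a7bce76715a9f3bd")

# The anchor document plus the 5 inserted after it (ObjectIds grow with insertion time).
current_and_after = list(comments.find({"_id": {"$gte": anchor}}).sort("_id", 1).limit(6))

# The 4 documents inserted just before the anchor, fetched newest-first and then reversed.
before = list(comments.find({"_id": {"$lt": anchor}}).sort("_id", -1).limit(4))

result = list(reversed(before)) + current_and_after   # chronological order
```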
Query by computed property in Python mongoengine
I wondered if it is possible to query documents in MongoDB by computed properties using mongoengine in Python. Currently, my model looks like this: When I do, for example, SnapshotIndicatorKeyValue.objects().first().snapshot, I can access the snapshot property. But when I try to query it, it doesn’t work. For example: I get the error `mongoengine.errors.InvalidQueryError: Cannot resolve field "snapshot"`. Is there any …
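A Python property is computed in the application and never stored in MongoDB, so the server has nothing to resolve the query against. One common workaround, sketched here with assumed field names and an assumed computation, is to denormalise the value into a real field on save:

```python
from mongoengine import Document, StringField

class SnapshotIndicatorKeyValue(Document):
    key = StringField()
    value = StringField()
    snapshot = StringField()   # stored copy of what used to be a computed property

    def clean(self):
        # clean() runs as part of validation on save(), keeping the stored field in sync.
        self.snapshot = f"{self.key}:{self.value}"   # hypothetical computation

# Now the field exists in MongoDB, so this query resolves:
# SnapshotIndicatorKeyValue.objects(snapshot="some:value")
```

The alternative is to translate the computation into a query on the underlying stored fields (or an aggregation pipeline) instead of querying the property itself.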
Conditional call of a FastAPI Model
I have a multilang FastAPI app connected to MongoDB. My document in MongoDB is duplicated in the two available languages and structured this way (simplified example): I therefore implemented two models, DatasetFR and DatasetEN; each one makes references to specific external Models (Enum) for category and tags in each language. In the route definitions I forced the language parameter to declare …
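One way to pick the model conditionally, sketched with heavily simplified stand-ins for DatasetFR and DatasetEN (the real fields and enums aren’t shown here): declare the language as an Enum path parameter and use a Union response model, letting the handler decide which concrete model to build.

```python
from enum import Enum
from typing import Union

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Lang(str, Enum):
    fr = "fr"
    en = "en"

class DatasetFR(BaseModel):   # simplified stand-in for the real FR model
    titre: str

class DatasetEN(BaseModel):   # simplified stand-in for the real EN model
    title: str

@app.get("/datasets/{lang}", response_model=Union[DatasetFR, DatasetEN])
def get_dataset(lang: Lang):
    # In the real app the document would be fetched from MongoDB for the given lang.
    if lang is Lang.fr:
        return DatasetFR(titre="exemple")
    return DatasetEN(title="example")
```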
How to fix a memory error while importing a very large CSV file into MongoDB in Python?
Given below is the code for importing a pipe-delimited CSV file into MongoDB. Below is the error I get when running the above code. If I modify the code with some indents under the for loop, the same data gets imported into MongoDB over and over again without stopping. Answer The memory issue can be solved by inserting one …
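A sketch of the chunked-insert approach the answer points toward (file path, collection names and batch size are placeholders): stream the CSV row by row and flush bounded batches with insert_many() so the whole file is never held in memory.

```python
import csv
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
coll = client["mydb"]["imported"]   # hypothetical target collection

BATCH_SIZE = 10_000                 # tune to the available memory

with open("data.csv", newline="") as f:        # placeholder path
    reader = csv.DictReader(f, delimiter="|")  # pipe-delimited file
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) >= BATCH_SIZE:
            coll.insert_many(batch)  # flush one bounded chunk
            batch = []               # release it before reading more rows
    if batch:
        coll.insert_many(batch)      # whatever is left at the end
```

Keeping insert_many() out of the per-row body (except behind the batch-size check) also avoids the endless re-insert behaviour described when the indentation is wrong.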