How to import a JSON file to MongoDB - mongoimport

I am having a problem importing a JSON file into MongoDB.
Here is what I wrote:
mongoimport --db tst --collection stocks --type json --file //home/ubuntu/workspace/stocksNsdq.json --jsonArray
This is the error that I got:
2014-08-23T15:05:09.569+0000 SyntaxError: Unexpected identifier
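For context, mongoimport is an operating-system binary, not a mongo-shell command; a SyntaxError like this often appears when the command is typed inside the mongo shell instead of bash. A minimal sketch of preparing and validating a --jsonArray file from bash (file contents here are hypothetical):

```shell
# Hypothetical sample data shaped like a JSON array, as --jsonArray expects
cat > /tmp/stocksNsdq.json <<'EOF'
[
  {"symbol": "AAPL", "close": 100.5},
  {"symbol": "GOOG", "close": 1200.3}
]
EOF

# Validate that the file parses as JSON before handing it to mongoimport
python3 -m json.tool /tmp/stocksNsdq.json > /dev/null && echo "valid JSON array"

# Run the import from bash (not from inside the mongo shell):
# mongoimport --db tst --collection stocks --file /tmp/stocksNsdq.json --jsonArray
```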

Related

How can we store Airbyte connector configuration data?

I need to store the Airbyte connectors, sources, and destinations configuration data as a backup.
I tried using Octavia, but I am facing an issue while downloading the data for a specific connection,
like this:
octavia import connection [connection-ID]
error: TypeError: _from_openapi_data() missing 3 required positional arguments: 'schema_change', 'notify_schema_changes', and 'non_breaking_changes_preference'
Is this a version issue, or something else?
I've updated to Octavia 0.40.18:
curl -s -o- https://raw.githubusercontent.com/airbytehq/airbyte/v0.40.18/octavia-cli/install.sh | bash
alias octavia="docker run -i --rm -v \$(pwd):/home/octavia-project --network host --env-file \${OCTAVIA_ENV_FILE} --user \$(id -u):\$(id -g) airbyte/octavia-cli:0.40.18"
and added these variables to the .octavia file:
nano ~/.octavia
OCTAVIA_ENABLE_TELEMETRY=True
AIRBYTE_USERNAME=airbyte
AIRBYTE_PASSWORD=password
and ran the octavia commands:
mkdir airbyte-configs-data && cd airbyte-configs-data
octavia init
octavia import all
and removed params from connections/configuration.yaml:
schema_change
notify_schema_changes
non_breaking_changes_preference
geography
octavia apply
Thanks
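The manual step of removing those params can be scripted. A sketch, assuming the keys appear as top-level `key: value` lines in connections/configuration.yaml (the file below is a hypothetical fragment):

```shell
# Hypothetical fragment of connections/configuration.yaml
cat > /tmp/configuration.yaml <<'EOF'
schema_change: ignore
notify_schema_changes: true
non_breaking_changes_preference: ignore
geography: auto
status: active
EOF

# Delete the four unsupported keys in place (GNU sed)
sed -i -E '/^(schema_change|notify_schema_changes|non_breaking_changes_preference|geography):/d' /tmp/configuration.yaml
cat /tmp/configuration.yaml   # prints: status: active
```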

ModuleNotFoundError: No module named 'maskrcnn_benchmark'

I am running the MLPerf object_detection project test:
https://github.com/mlperf/training/tree/master/object_detection
Question 1:
When doing:
nvidia-docker run -v .:/workspace -t -i --rm --ipc=host mlperf/object_detection \
"cd mlperf/training/object_detection && ./install.sh"
It responds with:
docker: Error response from daemon: create .: volume name is too short, names should be at least two alphanumeric characters.
I needed to change -v .:/workspace to -v $(pwd):/workspace.
Question 2:
After applying that modification, I got a new error:
docker: Error response from daemon: OCI runtime create failed: container_linux.go:345: starting container process caused "exec: \"cd mlperf/training/object_detection && ./install.sh\": stat cd mlperf/training/object_detection && ./install.sh: no such file or directory": unknown.
It seems Docker can't accept a string with spaces, e.g. "cd xxxxxxx && ./install.sh".
If I change the string to a single command (./install.sh):
nvidia-docker run -v $(pwd):/workspace -t -i --rm --ipc=host mlperf/object_detection \
"./install.sh"
This works, so it doesn't look like an incorrect-path problem; I tried an absolute path and got the same error.
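The underlying cause is that `docker run IMAGE CMD` passes CMD straight to exec() inside the container; no shell ever interprets the `&&`, so the whole quoted string is looked up as a single program name. Wrapping it in `bash -c` restores shell parsing. A sketch (the nvidia-docker line is commented out since it needs the image locally):

```shell
# With a shell wrapper, the compound command is parsed inside the container:
#   nvidia-docker run -v "$(pwd)":/workspace -t -i --rm --ipc=host \
#       mlperf/object_detection \
#       bash -c "cd mlperf/training/object_detection && ./install.sh"

# The exec-vs-shell difference, demonstrated locally:
sh -c "echo first && echo second"   # a shell splits && into two commands
```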
Question 3:
After following the steps on that page, I always get this error:
ModuleNotFoundError: No module named 'maskrcnn_benchmark'
root@nvme:/markkang/mlperf/training/object_detection# nvidia-docker run -v $(pwd):/workspace -t -i --rm --ipc=host mlperf/object_detection "./run_and_time.sh"
/workspace/pytorch /workspace
Traceback (most recent call last):
File "tools/train_mlperf.py", line 8, in <module>
from maskrcnn_benchmark.utils.env import setup_environment # noqa F401 isort:skip
ModuleNotFoundError: No module named 'maskrcnn_benchmark'
Edit train_mlperf.py and insert the following path code before the import from maskrcnn_benchmark.utils.env, e.g.:
import sys
sys.path.append('/workspace/pytorch/')
from maskrcnn_benchmark.utils.env import setup_environment # noqa F401 isort:skip
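An alternative that avoids editing the script is to put the package directory on PYTHONPATH (assuming, as above, that the package lives under /workspace/pytorch). The mechanism, demonstrated with a hypothetical stand-in package:

```shell
# Create a throwaway package to stand in for maskrcnn_benchmark
mkdir -p /tmp/pp/demo_pkg
touch /tmp/pp/demo_pkg/__init__.py

# With the parent directory on PYTHONPATH, the import resolves
PYTHONPATH=/tmp/pp python3 -c "import demo_pkg; print('import ok')"   # prints: import ok

# Applied to the real case:
# PYTHONPATH=/workspace/pytorch python tools/train_mlperf.py ...
```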

twitter error on rails console while using twurl api

I am trying to upload media to Twitter with the command below from my Rails console:
ActiveSupport::JSON.decode(twurl -H upload.twitter.com "/1.1/media/upload.json" -d "command=APPEND&media_id=1037245714940747777&segment_index=0" --file /home/administrator/Downloads/s.mp4 --file-field "media" -t)
and I am getting this error: JSON::ParserError: 822: unexpected token at ''
It is a JSON parsing error. Please help me solve this problem.
That command is not for the Rails console; it's a bash shell command line, so you can't just paste it into the Rails console and run it. Run it from the bash shell:
$> twurl -H upload.twitter.com "/1.1/media/upload.json" -d "command=APPEND&media_id=1037245714940747777&segment_index=0" --file /home/administrator/Downloads/s.mp4 --file-field "media" -t
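The shape of the fix is to capture the command's raw output first and decode it separately. A bash sketch using a stub payload in place of the real response (the twurl line is commented out since it needs credentials; the JSON fields are made up for illustration):

```shell
# response=$(twurl -H upload.twitter.com "/1.1/media/upload.json" \
#   -d "command=APPEND&media_id=1037245714940747777&segment_index=0" \
#   --file /home/administrator/Downloads/s.mp4 --file-field "media" -t)

# Stub payload standing in for the twurl response
response='{"media_id": 1037245714940747777, "expires_after_secs": 3600}'

# Decode/pretty-print it - the shell-side analogue of ActiveSupport::JSON.decode
echo "$response" | python3 -m json.tool
```

Worth noting: a successful APPEND call to media/upload returns an empty response body, which by itself is enough to make a JSON decoder raise "unexpected token at ''".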

Populating Neo4J In Docker Container

I created a Docker container for Neo4J using
$ docker run --name my-neo4j -p 7474:7474 -p 7687:7687 \
-v ~/path/to/volume1:/data:rw -d neo4j:3.0.6
This, as expected, creates an empty graph database. For our use case, we'd like to have a pre-populated database.
Any help would be appreciated - preferably command line options, if any.
Thanks in advance.
You can have a look at the example at https://dzone.com/articles/how-neo4j-data-import-minimal:
$ docker exec -i my-neo4j bin/neo4j-import --into /data/databases/person.db \
--nodes:Person /data/people.csv \
--relationships:KNOWS /data/friendships.csv
This will populate person.db. To populate the default graph.db, run:
$ docker exec -i my-neo4j bin/neo4j-shell < ./local/path/to/setup.cql
where the setup.cql looks like
LOAD CSV WITH HEADERS FROM 'file:///data/people.csv' AS line
FIELDTERMINATOR ','
MERGE (p:Person {person_id:line.personId, name:line.name});
USING PERIODIC COMMIT
LOAD CSV WITH HEADERS FROM "file:///data/friendships.csv" AS line
FIELDTERMINATOR ','
MATCH (p1:Person),(p2:Person)
WHERE p1.person_id = line.personId1 AND p2.person_id= line.personId2
CREATE UNIQUE (p1)-[r:KNOWS]->(p2);
The two CSV files, people.csv and friendships.csv, have the headers personId,name and personId1,personId2 respectively.
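For reference, minimal files matching those headers would look like this (names and IDs are made up):

```shell
# Hypothetical people.csv with header personId,name
cat > /tmp/people.csv <<'EOF'
personId,name
1,Alice
2,Bob
EOF

# Hypothetical friendships.csv with header personId1,personId2
cat > /tmp/friendships.csv <<'EOF'
personId1,personId2
1,2
EOF

head -1 /tmp/people.csv /tmp/friendships.csv
```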

restore mongodb database .bson and .json files

In a folder called my_backup I have a MongoDB database dump with all my models/collections, for example:
admins.bson
admins.metadata.json
categories.bson
categories.metadata.json
pages.bson
pages.metadata.json
.
.
.
I have a database called ubuntu_development in MongoDB. I am working with Rails 3 + Mongoid.
How can I import/restore all models/collections from the folder my_backup into my database ubuntu_development?
Thank you very much!
Execute this command from the console (in this case):
mongorestore my_backup --db ubuntu_development
mongorestore is followed by my_backup, the folder name where the previous dump of the database is saved.
--db ubuntu_development specifies the database name where we want to restore the data.
To import .bson files
mongorestore -d db_name -c collection_name path/file.bson
In case you want to restore only a single collection, try this:
mongorestore --drop -d db_name -c collection_name path/file.bson
To import .json files
mongoimport --db db_name --collection collection_name --file name.json
You have to run this mongorestore command from cmd, not from the Mongo shell:
>path\to\mongorestore.exe -d dbname -c collection_name path\to\same\collection.bson
Here path\to\mongorestore.exe is the path to mongorestore.exe inside the bin folder of MongoDB, dbname is the name of the database, collection_name is the name of the collection in collection.bson, and path\to\same\collection.bson is the path to that collection file.
From the Mongo shell you can then verify that the database was created (if it did not exist, a database with that name is created along with the collection).
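The single-collection form above can also be looped over every .bson file in the dump folder. A sketch with a hypothetical /tmp dump and the actual restore line commented out:

```shell
# Hypothetical dump folder mirroring the my_backup layout
mkdir -p /tmp/my_backup
touch /tmp/my_backup/admins.bson /tmp/my_backup/categories.bson /tmp/my_backup/pages.bson

# Restore each collection individually (equivalent overall to
# "mongorestore my_backup --db ubuntu_development")
for f in /tmp/my_backup/*.bson; do
  coll=$(basename "$f" .bson)
  echo "restoring collection: $coll"
  # mongorestore -d ubuntu_development -c "$coll" "$f"
done
```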
