How to migrate my local Neo4j dataset to the GrapheneDB instance - neo4j

I am currently working on a project and it's time for me to host both the application and my graph database. I have chosen Heroku, and I have been able to deploy my application and add the GrapheneDB add-on. Now I would like to migrate my local dataset to my online database. I have been searching for two days now.
Every time I try to restore the database, I get an error.

To quote from GrapheneDB's troubleshooting section for importing:
When a restore process fails, it’s usually due to one of the following reasons:
The store files were copied while Neo4j was still running: Make sure Neo4j is stopped.
The store files correspond to a newer version of Neo4j than the one on GrapheneDB: Make sure you restore to the same version or higher.
The compressed file is not a supported format: Make sure you use one of our supported formats, which include zip, tar, cpio, gz, bz2 and xz.
There are store files missing within the compressed file: Make sure the archive contains the full graph.db directory and all files inside (use the recursive option when creating the archive).
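For illustration, here is a minimal Python sketch of that last point, assuming a stopped local Neo4j whose store lives at /var/lib/neo4j/data/graph.db (the path is an assumption; adjust it to your installation):

```python
# Hedged sketch: package a stopped Neo4j store into a gzipped tar archive
# that contains the full graph.db directory, recursively.
import tarfile
from pathlib import Path

STORE_DIR = Path("/var/lib/neo4j/data/graph.db")  # assumption: adjust to your data directory

with tarfile.open("graph.db.tar.gz", "w:gz") as archive:
    # tarfile.add() is recursive by default, so every file under graph.db is included
    archive.add(str(STORE_DIR), arcname="graph.db")

print("Created graph.db.tar.gz; upload it through GrapheneDB's restore option")
```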

Related

How can I move my Appwrite image alongside all the projects to a separate computer?

I am new to this whole containerization and backend-as-a-service world of technologies, but I wanted to give Appwrite a shot because it seemed very easy and well suited for a small project I am about to build. The only problem is I do not know that much about Docker, and I am a bit unsure if and how I will be able to move the Appwrite image instance that is running locally, with all the changes that I have made to it (i.e. created projects, existing db documents, functions etc.), to a production server or any other computer. How might I be able to do this? Thanks.
If you're looking to move the configuration for your project AND the data, the best thing to do would be to:
Backup your project
Move the backups and the appwrite folder to the new location (a packaging sketch follows these steps)
Start Appwrite
Restore the backup
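As a rough, hedged sketch of the "move" step only, assuming the compose files live in a local appwrite folder and your previously created dumps are in a backups folder (both names are assumptions), you could bundle everything into one archive for transfer; the actual database and storage dumps still have to be produced as described in the backup docs referenced below:

```python
# Hedged sketch: bundle the Appwrite compose folder plus previously created
# backup dumps into a single archive that can be copied to the new machine.
import tarfile
from pathlib import Path

APPWRITE_DIR = Path("./appwrite")   # assumption: folder holding docker-compose.yml and .env
BACKUPS_DIR = Path("./backups")     # assumption: wherever you stored the DB/volume dumps

with tarfile.open("appwrite-migration.tar.gz", "w:gz") as archive:
    archive.add(str(APPWRITE_DIR), arcname="appwrite")
    if BACKUPS_DIR.exists():
        archive.add(str(BACKUPS_DIR), arcname="backups")

# Copy appwrite-migration.tar.gz to the new machine, extract it, start Appwrite
# from the appwrite folder, then restore the backups as per the docs below.
```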
If you only need to migrate the schema for your collections, you can use the Appwrite CLI to create an appwrite.json file of your project and then deploy it to another instance. The CLI can also be used to manage your Appwrite Functions.
References:
YouTube Video on Backing Up and Restoring
Docs on Backups
Docs on Appwrite CLI

Way to export a Neo4j graph to rebuild the database

I'm working with the Neo4j graph database these days for a project, and as a precaution I need to find out whether there is a way to export the graph every time I build or change it, so that if an accidental deletion occurs I can rebuild the graph. For example, in MySQL we can export the database into a SQL script and rebuild the database by running it. What I'm asking is, is there a way to do the same thing in Neo4j?
PS: I use an online sandbox provided by graphenedb.com, not an installation on my local computer.
You could use GrapheneDB's export feature. The export file can be used within GrapheneDB to restore into any other database, or to take your data elsewhere.
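If you want a script you can run after every change, here is a hedged Python sketch that uses the official Neo4j driver to call APOC's Cypher export. It assumes APOC is installed and file export is enabled on the instance, which may not be the case on a hosted sandbox; there you would rely on GrapheneDB's own export instead. The URI and credentials are placeholders:

```python
# Hedged sketch: dump the whole graph to a Cypher script that can rebuild it,
# similar in spirit to exporting a MySQL database to a .sql file.
from neo4j import GraphDatabase  # pip install neo4j

BOLT_URI = "bolt://localhost:7687"   # assumption: replace with your instance's Bolt URL
AUTH = ("neo4j", "your-password")    # assumption: your credentials

driver = GraphDatabase.driver(BOLT_URI, auth=AUTH)
with driver.session() as session:
    # Requires the APOC plugin with apoc.export.file.enabled=true in neo4j.conf
    session.run("CALL apoc.export.cypher.all('all.cypher', {format: 'plain'})")
driver.close()
# The resulting all.cypher file contains CREATE statements for the nodes,
# relationships and schema, and can be replayed to rebuild the database.
```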

What do all the options on GetOptions mean?

The MSDN documentation lists four options, with limited explanation:
Overwrite "Overwrite existing writable files if they conflict with the downloaded files." Does this apply to all files, or just ones we've told TFS we've edited?
GetAll "Gets all files." What files does TFS not normally get?
Preview "Executes a get without modifying the disk." This one seems pretty clear.
Remap "Remaps existing items on the disk to the server items where the content and disk location are not changing." I have no idea what this means.
Overwrite: will blindly overwrite writable files that you have not pended for edit. If you have marked a file as 'writable' then you have violated the contract with TFS, and it assumes that you have done this for a good reason (e.g., modifying the file without taking a checkout because you were working offline). This will generally produce a writable conflict on the file, but if you specify this flag, the writable file will be overwritten.
This only applies to server workspaces (local workspaces are always writable). This has no effect on files that you have pended for edit. Get will always produce conflicts for files that are edited locally and updated on the server; if you want to update files that are checked out, you must undo the checkout (or resolve the conflict with TakeTheirs).
GetAll: will download every file and update it, even if TFS believes that the local version is the same as the remote version and that downloading a new version would be a no-op. TFS tracks every version that you have locally, as well as remotely, so this is only useful if you edit files locally without checking them out.
If you have kept them writable, then, as mentioned above, this will produce a writable conflict. If you have marked them read-only, TFS assumes that you have not made any changes and will not bother updating them when you do a get (because it believes the file contents haven't changed). If you have manually changed the file contents, specifying this flag will update those files to the server version.
Preview: will just fire events and provide results that indicate what would be downloaded with the given parameters.
Remap: is a clever option that allows you to perform an in-place branch switching (which is very common with some version control systems that branch at the repository level - like Git - but somewhat complicated in TFVC.)
Consider that you have mapped $/Foo/main to C:\Foo and done a get latest. If you update your working folder mappings so that $/Foo/branches/feature now points to C:\Foo and then issue a get with Remap, the server will download only the files that differ between main and branches/feature, so it's an inexpensive way to switch your local workspace to a feature branch.
(If you're looking for an example, this functionality exists in the command-line interface and in Team Explorer Everywhere but not in Visual Studio.)

TF400018: Local version table locked

Got the error below in TFS and VS 2012 RC; does anyone know of a fix? It doesn't seem to be documented on the MS website.
TF400018: The local version table for the local workspace COMPUTERNAME;MYNAME could not be opened. The process cannot access the file because it is being used by another process.
Any suggestions welcomed.
We experienced this one as well. Migrating to the RTM makes this happen a lot less, but it can still happen fairly often.
When using local workspaces (a new feature in VS 2012), a local file-based database is created to administer the changes you make locally. When you change a source file, this file-based database needs to be updated. If this update conflicts with the normal update task, which routinely checks for changes, you get this error. The cause of this issue is usually that you are using a local workspace for more items than it was intended for, or that your disk I/O is too slow.
Workarounds for this include:
Replace your disk with an SSD. Having better I/O makes this issue happen a lot less.
Switch back to server-based workspaces (which handle this better).
Use the TFS-Git connector and use Git for offline support.
Split your workspace mapping into portions so they contain fewer items.
Please delete the files under the %Temp% folder and open the project in "Run as Administrator" mode. It works for me.
Regards,
Kamaraj

Is there any simple automated way of finding out all the source files associated with a Delphi project?

I like to back up the source code set for a project when I release a version. I use GExperts project backups, which seems to gather up all the files in the project manager into the ZIP file. You can also add arbitrary files to this file set, but I'm always conscious of the fact that I haven't necessarily got all the files. Unless I specifically go through the uses clauses and add all the units I have sources for to the project, I'll never be sure of storing all the files necessary to recreate the installable/executable.
I've thought about rolling an app to traverse a project, following all the units used and looking down all the search paths to see if there is a source file available for each unit, and building a list of files to back up that way. But hey, maybe someone has already done the work?
You should look into version control (highly recommended),
e.g. SVN (Subversion) or CVS.
This will allow you to control revisions of all of your source. It will allow you to add or remove source files, roll back, merge, and do all the other nice things related to managing project sources.
This WILL save your a$%# one day.
You can interpret your question in two ways:
How can I make sure that I back up at least enough files so I can build the project?
How can I make sure that I don't back up too many files, so I can still build the project?
The first is to make sure you can build the system at all, the second to allow you to clean up unused files.
For both, a version control system including a separate build system is the way to go.
Then, for each new set of changes, you can use these steps to ensure that both conditions hold:
On your daily development system, check the new revision of your source code into your version control system.
On your separate build system, get the latest version from your version control system.
Build the project on the build system; if this fails, go to step 1 and add the missing files to your version control system from your development system.
Start removing, one by one, files from the project that you suspect are not needed, then rebuild until it fails (see the sketch after these steps).
When the build fails, restore that particular file from the version control system, then continue with the next candidate.
When the build succeeds, you have the minimum set of files.
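A rough sketch of that remove-and-rebuild loop, assuming a command-line build; the dcc32 invocation, the candidate file names, and the _removed parking folder are all placeholders, and in practice you would restore from your version control system rather than a local folder:

```python
# Hedged sketch: find the minimum set of source files by removing candidates
# one by one and restoring any file whose removal breaks the build.
import shutil
import subprocess
from pathlib import Path

BUILD_CMD = ["dcc32", "MyProject.dpr"]      # placeholder: whatever your build system runs
CANDIDATES = [Path("UnusedUnit1.pas"),      # placeholder: files you suspect are unneeded
              Path("UnusedUnit2.pas")]
PARKED = Path("_removed")
PARKED.mkdir(exist_ok=True)

def build_ok() -> bool:
    return subprocess.run(BUILD_CMD, capture_output=True).returncode == 0

for candidate in CANDIDATES:
    shutil.move(str(candidate), str(PARKED / candidate.name))    # remove the candidate
    if not build_ok():
        # build broke, so the file is needed: put it back (or restore it from VCS)
        shutil.move(str(PARKED / candidate.name), str(candidate))

print("Files still in place are required; files left in _removed/ are unused.")
```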
Now make a difference overview of the files in your version control system and the files on the build machine.
Mark the files that are in your version control system but not on your build machine as deprecated or deleted.
Most version control systems have good ways of generating a difference between the files on your development or build system and the files in the version control system (usually fine-grained for each historic point in time at which you added, removed or updated files).
The reason you want a separate build system (or two separate development systems) is that you want them to be independent: you use one for developing, and the other for checking if the build is still OK.
This is a first step; in the future you might want to extend this into a continuous integration system (one that runs unit tests, automatically creates product setups and much more).
--jeroen
I'm not sure if you're asking about version control or how to be sure you've got all the files.
One useful utility I run occasionally is a program that makes a DirList of all of the files in my dcu output folder. Changing the extensions from .dcu to .pas gives me a list of all of the source code files.
Of course it misses .inc files and other non-.pas files, but perhaps this line of thinking would be helpful to you in some way?
The value of this utility to me is that a second housekeeping utility program then makes a list of all .pas files in my source tree that do not have corresponding .dcu files. This (after a full compile of all programs) generally reveals some "junk" .pas files that are no longer in use.
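In the same spirit, here is a small hedged Python sketch of both housekeeping passes; the DCU output folder and source tree paths are assumptions, and like the original utilities it ignores .inc files:

```python
# Hedged sketch: map compiled .dcu names back to .pas names, then flag
# .pas files in the source tree that never produced a .dcu (likely unused).
from pathlib import Path

DCU_DIR = Path(r"C:\Projects\MyApp\dcu")   # assumption: your DCU output folder
SRC_DIR = Path(r"C:\Projects\MyApp\src")   # assumption: root of your source tree

compiled_units = {p.stem.lower() for p in DCU_DIR.glob("*.dcu")}

print("Source files corresponding to compiled units:")
for name in sorted(compiled_units):
    print(f"{name}.pas")

print("\n.pas files with no matching .dcu (candidates for removal):")
for pas in sorted(SRC_DIR.rglob("*.pas")):
    if pas.stem.lower() not in compiled_units:
        print(pas)
```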
To get a list of all units compiled into an executable, you could let the compiler generate a MAP file. This file will contain entries for all the units used.
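As a hedged illustration (the exact layout varies by Delphi version), the detailed segment section of a .map file usually tags each segment with an M=<unit> entry, which a few lines of Python can collect:

```python
# Hedged sketch: pull unit names out of a Delphi linker .map file by scanning
# for M=<unit> markers in the detailed segment map. Layout varies by version.
import re
from pathlib import Path

MAP_FILE = Path("MyProject.map")   # assumption: build with a "Detailed" map file enabled

units = set()
for line in MAP_FILE.read_text(errors="ignore").splitlines():
    match = re.search(r"\bM=([\w.]+)", line)
    if match:
        units.add(match.group(1))

for unit in sorted(units):
    print(unit)
```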
